Words employ a complex system of symbols and metaphors to
translate an internalized human sense or experience into
vocalized representations. In this capacity, information
retrieval is instantaneous: information about man and his
surrounding environment can be retrieved and represented in the
form of vocal symbols with precision and clarity. In
Understanding Media, McLuhan refers to words as a “technology of
explicitness” and adds that “Man has the power to reverberate the
Divine thunder, by verbal translation” (57).

As empowering as this technology may be, words, in both
oral and printed representations, are fragmentary; one cannot
express a totality of experience through them. On the other hand,
electronic technology, such as computing devices, evokes the
mind’s ear; that is, we are not limited to our own physical
organic extensions in experiencing the world around us. Electronic
communication systems round out the totality of expression and
experience as they act as an extension of our nervous system.
Given that “all active media are metaphors in their power to
translate experience into new forms” (McLuhan 57), electronic
text, or etext, is a powerful metaphor for print. Etext media
transform the experiences of the preceding print media into new
and fascinating forms.

The word processor lies at the heart of the transformation
of print media. The concept of processing words is a product of
the electric revolution. From a design perspective, a word
processor is a computer program that is specifically designed
with algorithms to manipulate objects such as text. From a
functional point of view, a word processor enables one to perform
many text-editing functions which, judged against the tools
available to earlier media, affect the art of self-expression to
a far greater degree.

Unlike document editing on paper, electronic error
correction is a virtual playground. Where cover-up chemicals are
required in written and print media, correcting words to entire
paragraphs in a word processing environment is performed by
electrically turning the unwanted characters off – this document
contains 534 errors which are not visible to the reader. Another
fundamental function of the word processor is “cut and
paste”: any piece of text can be copied and moved from one area
of the document to another; there are no scissors or glue
involved. The etext author is given carte blanche, as
there are no physical boundaries to an etext document. Unlike
vellum or paper, page size, justification and document formatting
are all user-configurable parameters.
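The editing functions described above can be sketched in a few lines of code. This is purely an illustrative model of the idea, not any actual word processor’s implementation: the document is treated as a character buffer, and the function names are invented for the example.

```python
# Illustrative sketch only: a document modeled as a character buffer.
# "Error correction" simply drops the unwanted characters, and
# "cut and paste" moves a slice of text -- no scissors or glue involved.

def delete(buffer: str, start: int, end: int) -> str:
    """Turn the characters in [start:end) 'off' by omitting them."""
    return buffer[:start] + buffer[end:]

def cut_and_paste(buffer: str, start: int, end: int, dest: int) -> str:
    """Cut the slice [start:end) and paste it at index dest."""
    clip = buffer[start:end]                  # cut
    rest = buffer[:start] + buffer[end:]      # close the gap
    return rest[:dest] + clip + rest[dest:]   # paste

print(delete("hello cruel world", 5, 11))          # -> hello world
print(cut_and_paste("and paste cut ", 10, 14, 0))
```

In this model, nothing physical is erased or reattached; the buffer is simply rebuilt without the unwanted span, which is why the corrections leave no visible trace.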

Adaptively, etext is content-insensitive. From the
perspective of a word processing system, raw data such as textual
information can be manipulated and formatted regardless of the
type of content. Where traditional print media are fixed by
such factors as size, binding, layout and graphic interpolation,
etext is highly malleable; its virtual format can be adapted
to many layers of textual organization. Even at their simplest,
word processors are capable of performing highly
sophisticated formatting and print functions. In short, the word
processor is a complete editing environment which has
restructured print culture. As Paul Levinson writes:

“the personal computer and its word processing, along with the
on-line network and its worldwide, instantaneous, hypertext
access, seem likely to be every bit as revolutionary and
reconfiguring of culture and human existence as the alphabet and
the printing press” (74).

Truly, in order to absorb the full impact of etext as a new
medium, one must analyze its associations with other
facets of electronic media, such as electronic information
retrieval, distribution, archiving, multimedia and navigation, as
well as the phenomenological issues.

Information in the form of unstructured etext arises in many
situations: news items, brochures, office circulars and memos,
scientific abstracts, descriptions of projects and employees,
literary indexes, and even personal notes and memos. Information
retrieval systems are specifically designed to handle such
unstructured collections of text information which we will call
textbases. These textbase systems are different from the
conventional database systems in the way they index and retrieve
the information, and in the kinds of access that they permit.
The MIT Media Lab identifies three main approaches to the
retrieval of text-based information: (a) statistical, (b) semantic
and (c) contextual/structural (AAG 1). The first approach
emphasizes statistical correlations of word counts in documents
and document collections; semantic indexing is a further example
of a statistical method to capture the term associations in
documents (AAG 2). The semantic approach to retrieval
characterizes the documents and queries so as to represent their
underlying meaning; it emphasizes natural language processing or
the use of artificially intelligent inference. The third
approach, also known as “smart” Boolean, takes advantage of the
structural and contextual information typically available in
retrieval systems.
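The first, statistical approach can be illustrated with a short sketch: documents are ranked by how often the query’s words occur in them. This toy code is my own illustration of word-count scoring, not the MIT Media Lab’s system, and the sample textbase is invented.

```python
from collections import Counter

# Invented sample "textbase" of unstructured etext.
textbase = {
    "memo":     "the printing press and the press of the deadline",
    "abstract": "electronic text retrieval over wide area networks",
    "note":     "retrieval of electronic text documents by text queries",
}

def score(query: str, text: str) -> int:
    """Sum of raw word counts for each query term -- the simplest
    statistical correlation between a query and a document."""
    counts = Counter(text.lower().split())
    return sum(counts[word] for word in query.lower().split())

def rank(query: str) -> list:
    """Document names ordered from best to worst match."""
    return sorted(textbase,
                  key=lambda name: score(query, textbase[name]),
                  reverse=True)

print(rank("electronic text retrieval"))  # -> ['note', 'abstract', 'memo']
```

Real systems refine these raw counts (weighting rare terms more heavily, normalizing for document length), but the principle of ranking by statistical correlation is the same.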

Etext encourages the decentralization of information.
The Internet provides a fertile ground for distributed retrieval
techniques. A number of services have surfaced on the Internet
to help users search and retrieve documents from servers around
the world: WAIS, Gopher and the World Wide Web, to name a few. Wide
Area Information Servers (WAIS) is a network-based document
indexing and retrieval system for textual data. The servers
maintain inverted indexes of keywords that are used for efficient
retrieval of documents (Graham 78). WAIS allows users to provide
relevance feedback to further specialize an initial query. Gopher
is primarily a tool for browsing through hierarchically organized
documents, but it also allows users to search for information
using full-text indexes. On the World Wide Web (WWW), information
is organized as hypertext, where users can explore
information by selecting hypertext links to other information.
Documents also contain indexes which the user can search for
current or archived information.
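The inverted index that WAIS-style servers maintain can be sketched as a mapping from each keyword to the set of documents containing it; a query then becomes a lookup and a set intersection rather than a scan of every text. The documents below are invented for illustration.

```python
from collections import defaultdict

# Invented sample documents standing in for a server's collection.
documents = {
    "doc1": "gopher browses hierarchically organized documents",
    "doc2": "the web organizes hypertext documents with links",
    "doc3": "wais indexes documents for keyword retrieval",
}

# Build the inverted index: word -> set of documents containing it.
index = defaultdict(set)
for name, text in documents.items():
    for word in text.lower().split():
        index[word].add(name)

def lookup(*words) -> set:
    """Documents containing every query word (a Boolean AND query)."""
    hits = [index[w.lower()] for w in words]
    return set.intersection(*hits) if hits else set()

print(sorted(lookup("documents", "retrieval")))   # -> ['doc3']
```

Because the index is built once, each keyword query costs only a dictionary lookup per term, which is what makes retrieval over large collections efficient.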

Archiving of digital information is primarily concerned with
ensuring that information in digital form endures for future
generations. The question of preserving or archiving digital
information is not a new one and has been explored at a variety
of levels over the last two decades. Archivists have perhaps been
most acutely aware of the difficulties as they have observed the
rapid and widespread shift from the use of typewriters and other
analog media to word processors, spreadsheets and other digital
means. Preserving the media on which information is
electronically recorded is well understood to be a relatively
short-term and partial solution to the general problem of
preserving digital information. Even if the media could be
physically well-preserved, rapid changes in the means of
recording, in the formats for storage, and in the software for
use threaten to render the life of information in the digital age
quite short. Yet whatever the technical options, preserving
electronic information is not only a technical matter.
Selection is an issue common to all archiving functions.

The selection of information that demonstrates high
interoperability between analog and digital preservation is an
important goal in digital archiving. Furthermore, questions of
intellectual judgment, what information to discard and what to
carry forward in what structure and format, are always among the
more difficult issues in creating and maintaining an archive.

Etext can be distributed in a variety of popular formats. The
formats are aptly named by their file “extensions”, which include
.html (HyperText Markup Language), .txt (plain ASCII text, from
the American Standard Code for Information Interchange) and .ps
(PostScript) (Graham 61). Early etext formats were largely word
processor insensitive, that is, universal to all word processors.
With the advent of commercial word processors, many etext formats
have become proprietary and will only be translated by their
respective word processing packages.

From a commercial perspective, the principal advantage of etext
distribution is not only that it ensures currency, but also that
it allows users or readers to print just what they need, when they
need it. Using both traditional print material, such as a book,
and on-line distribution for text material offers the greatest
number of advantages, because it dramatically enhances the
flexibility of the work as a learning tool and consequently
expands its market potential. The lure of “free” information
is almost irresistible and will generate much media attention
and word-of-mouth recommendations.

Etext systems have evolved from their primitive unary existence
into multisensory environments. Developments in video, audio
and spatial media have provided new ground for etext. The term
given to the integration of such a mosaic of media is
“multimedia”, and it is rather misunderstood, as Michel Roy
describes:

“Given these fundamental and perhaps unresolvable contradictions
that lie at the center of any discussion …about what is “true
multimedia” I would like to explore these issues within the
context of the complex interplay of intellectual, social and
technical issues that come to bear on any particular project” (54)

However, from a historical perspective, electronic text-based
systems evolved from glowing characters on a green
computer terminal to complementing text with vivid audio and
video arrangements. We recall the conditions of the pre-graphic,
pre-sound, that is, purely text-oriented environment of the
Internet (Comer 7). Etext on the Internet experienced a
“unimedia” infancy; that is, electronic documents were presented
in a purely textual format and no meshing of media was observed.
“Gopher”, a retrieval tool discussed above, is representative of
this “unimedia” infancy, as it was utilized for the delivery of
purely textual information before the WWW. Similarly, digital sound
reproduction as well as graphics and video utilized separate
channels for playback. With the advent of NCSA Mosaic, we
observe the initial stages of media integration.
Mosaic fused the delivery of text, audio and video and gave life
to the idea of multimedia over a network.

On a local level, multimedia hardware is also representative of
stand-alone computers. Peripheral devices such as CD-ROMs, high
definition audio and video cards, scanning devices, pointing
devices, and high definition video terminals and speakers create
a rich environment for etext. A prominent example of the
interplay of multimedia software and hardware is the Grolier
Multimedia Encyclopedia, in which etext is complemented by motion
picture representations and audio clips of the content. In this
form, etext is not only a metaphor for print but becomes an
allusion. A new high-level language of experience is created.

Etext systems provide a rich environment for navigation of
textual information. The term “navigation”, in the context of
popular electronic media, refers to the accessibility, retrieval
and searching of relevant and topical information. Navigation of
etext systems, in both distributed and locally available forms, is
a crucial criterion for the assessment of etext systems.
We must be careful not to confuse navigation with information
retrieval. Navigating through etext-based systems entails a
symphony of virtual motor skills; that is, navigation, as the
term implies, must steer one to a desired destination. As such,
the end objective or target of a search through an etext
document ceases to be the sole focus. Navigation through relevant
links, inferences and bits of information elevates print
technology to a higher degree of involvement.

Initiated in 1971, the Gutenberg project is one of the first and
foremost initiatives to create a universal library. Based out of
the University of Illinois at Urbana-Champaign, the Gutenberg
project’s mission is to complete what Johann Gutenberg started:
the massive digitization of published literature. This
large-scale etext project is quite revolutionary. In an
environment as rich as ours, flooded by high-definition graphics
and Java-powered applet billboards, the Gutenberg project is
powered by plain ASCII text files. There is a blatant refusal of
the robust technology. This is etext in its simplest form. Even
the electronic distribution techniques are lackluster; they
utilize the file transfer protocol (FTP) in its original UNIX
implementation – no fancy web servers.

There is a “redesign of democracy” (Hamilton 5) in the Gutenberg
project; public etext is distributed freely. Following the above
political allusion, the Gutenberg project enables “armchair”
authors to contribute etext versions of such works as the Bible,
Hamlet and Virgil’s Aeneid. Authorship, as reflected by the
publishing ethics of the Gutenberg project, is left in the hands
of the individuals who are reproducing the classics. Content
relevancy is largely a matter of individual arbitration. Denise
Hamilton captures the heart of the issue in her interview with
Michael Hart, director of the Gutenberg project, when she writes:

So in one sense, Hart is an electronic David, striking a
literary blow against an establishment Goliath that tries to
control information through restrictive copyright law,
downloading fees, and red tape. (6)

Strikingly, and perhaps unintentionally, the Gutenberg project
outlines the phenomenological effects of etext. Etext brings
about a new definition of authorship. McLuhan describes the rule
of the “author-publisher” brought about by print media in The
Gutenberg Galaxy:

…so the margins also developed a merely consumer attitude to
literature and the arts, such as has lingered until this
century…The conformists inclined to the author-publisher, ruler
of the new force. It may or may not be significant that most
of the English literature since printing has been created by this
ruler-oriented minority. (237)

and contrasts authorship as a manifestation of electronic media
in The Medium is The Massage:

“Authorship” – in the sense we know it today, …individual
intellectual effort related to the book as an economic commodity –
…Xerography – every man’s brain-picker – heralds the times of
instant publishing. Anybody can now become both author and
publisher.” (122-123)

Etext enables man, regardless of intellect, race, or wealth, to
publish self-proclaimed works of art without literary guidelines.

We find that etext media is approached from the perspective
of print. As we read in The Medium is The Massage, “we look
at the present through a rear-view mirror” (McLuhan 75).
The most visible effect of this “rear-view mirror” syndrome is
the lack of a new language to express the phenomenon; “Our New
Culture is striving to do the work of the old” (McLuhan 94). In
our discussion of the word processor, we observed that words like
“cut and paste”, “editing” and “error correction” are borrowed
directly from print media. We also find this phenomenon present
in the notion of a “Digital Library”: many institutions shelve,
code, classify and distribute electronic media as they would
print media.

There is also a redistribution of power that comes along with the
decentralization of information. We read in The Medium is The
Massage that “print created the public, electric technology
created the masses” (McLuhan 68). The distribution and
availability of etext-based documents, as discussed above, over
high-speed networks has virtually eliminated an old class of
literary elite. Instead, power, as it pertains to the ability to
access knowledge bases, is increasingly available (13) to the
“global village” (qtd. in Moos 140).

Our digital media transforms the way we live and communicate.
McLuhan elegantly describes the phenomena as we read in
Understanding Media:

“…since the inception of the telegraph and radio, the globe has
contracted, spatially, into a single large village. Tribalism is
our only resource since the electro-magnetic discovery. Moving
from print to electronic media we have given up an eye for an
ear.”

Etext is a hallmark of our shifting back to a tactile mode of
group communication. As such, etext is indeed a metaphor for
print since we can now hear what we are writing.

Works Cited

Autonomous Agents Group (AAG). Information Retrieval. MIT Media
Lab. Online. subsubsection2_5_1_1_1.html, 1997.

Comer, Douglas E. Internetworking with TCP/IP: Principles,
Protocols and Architectures. New Jersey: Prentice Hall, 1991.

Hamilton, D. "The Hart of the Gutenberg Galaxy". Online.
     http://www.wired.com/wired/5.02/gutenberg/, 1997.

Graham, Ian S. HTML Sourcebook. Toronto: John Wiley & Sons, 1995.

Jensen, Mike. "Policy Constraints to Electronic Information
Sharing in Developing Countries". On The Internet,
November/December 1997: 13-15.

Kerckhove, D. The Skin of Culture: Investigating the new
electronic reality. Toronto: Somerville House Publishing, 1995.

Levinson, P. The Soft Edge: A natural history and future of the
information revolution.  New York: Routledge, 1997.

McLuhan, M. The Gutenberg Galaxy. Toronto: University of Toronto
Press, 1995.

___________. Understanding Media: The Extensions of Man.
     Cambridge: The MIT Press, 1994.

___________ and Fiore, Q. The Medium is the Massage: An
inventory of effects. San Francisco: Hardwired, 1996.

Moos, Michael A. "McLuhan’s Language for Awareness Under
Electronic Conditions" in Moos (ed.), Media Research:
Technology, Art, Communication. Amsterdam: Overseas Publishers
Association, 1997: 140-166.

Negroponte, N. Being Digital. New York: Vintage Books, 1995.

Roy, M. "How to Do Things Without Words: The Multicultural
Promise of Multimedia" in E. Barrett and M. Redmond (eds),
Contextual Media: Multimedia and Interpretation. Cambridge: The
MIT Press, 1995.