A Companion to Digital Humanities
Ch. 16: Marking Texts of Many Dimensions
Author: Jerome McGann
In the second paragraph of his chapter, Jerome McGann describes the process of digitizing text:
As we lay foundations for translating our inherited archive of cultural materials, including vast corpora of paper-based materials, into digital depositories and forms, we are called to a clarity of thought about textuality that most people, even most scholars, rarely undertake. (emphasis added)
This passage resonates with conventional wisdom among translators, who often say, in the words of translator Giovanni Pontiero, that "...the most careful reading one can give of a text is to translate it." The parallel is striking: translators and digital humanists alike regard their way of engaging with texts as more careful than other modes of reading.
The parallel between digitization and translation continues into McGann's third paragraph. Compare the following quotations:
"All text is marked text..." -- McGann
"All acts of communication are acts of translation." -- George Steiner, After Babel
Later in the chapter, McGann raises another potentially interesting parallel between digitization and translation when he writes:
Two forms of critical reflection regularly violate the sanctity of such self-contained textual spaces: translation and editing... Consequently, acts of translation and editing are especially useful forms of critical reflection because they so clearly invade and change their subjects in material ways. To undertake either, you can scarcely not realize the performative - even the deformative - character of your critical agency. (emphasis added)
McGann refers here to editing, not digitization, but his overall perspective suggests either that digitization is a subset of editing or that it could stand as a third form of critical reflection alongside translation and editing. If neither is the case, it is unclear why McGann discusses translation and editing in this chapter at all.
Allopoietic vs. Autopoietic Systems
Despite the interesting intertextuality of his introduction, the remainder of McGann's chapter is difficult to follow. His distinction between allopoietic and autopoietic systems is far from clear. In particular, McGann never states plainly whether coding and markup are allopoietic or autopoietic. At one point he says that they "appear" allopoietic but "are not like most allopoietic systems," because "Coding functions emerge as code only within an autopoietic system that has evolved those functions as essential to the maintenance of its life." This seems to presuppose that code is integral to the text, which fits with the claim that "All text is marked text." The lack of concrete examples in this section, however, makes it difficult to know for certain what McGann is getting at.
The Dimensions of Textual Fields
In the section of the chapter entitled "Marking the Text: A Necessary Distinction," McGann explains why SGML-based tagging schemas are unable "to render the forms and functions of traditional textual documents." In essence, he argues that all tagging efforts rest on an incomplete "conception of textuality" because they capture only the linguistic dimension, while books "organize themselves along multiple dimensions of which the linguistic is only one." In one of the chapter's appendices, McGann enumerates the six dimensions of "textual fields":
- Linguistic
- Graphical/Auditional
- Documentary
- Semiotic
- Rhetorical
- Social
While the first three categories are straightforward (given that McGann defines the documentary dimension as the text's "transmission history"), the latter three are unclear. His definition of the semiotic dimension is opaque, and his description of the rhetorical dimension is reductionist ("The dominant form of this dimension is genre"). The social dimension is interesting, but it is also the one furthest removed from traditional theories of textuality; McGann argues that, in those theories, "the social dimension is not considered an intrinsic textual feature or function." In any case, McGann seems to dismiss the value of analyzing the linguistic dimension of a text without taking the other dimensions into account.
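To make the gap concrete, here is a minimal sketch (my own illustration, not McGann's method or an official TEI example) of a tag-based encoding built with Python's standard library. The element names lg and l follow common TEI conventions, but the document structure shown is a simplified assumption. It suggests how naturally the linguistic dimension fits into inline markup, and how little room the element tree leaves for the documentary or social dimensions.

    # Minimal sketch of a TEI-style encoding of two lines of verse.
    # Assumes simplified TEI-like element names (lg, l) for illustration only.
    import xml.etree.ElementTree as ET

    # The inline markup below captures the *linguistic* dimension
    # (words, line and stanza divisions) quite comfortably.
    lg = ET.Element("lg", type="stanza")
    line1 = ET.SubElement(lg, "l")
    line1.text = "Two roads diverged in a yellow wood,"
    line2 = ET.SubElement(lg, "l")
    line2.text = "And sorry I could not travel both"

    # The documentary dimension (transmission history) and the social
    # dimension have no obvious home in the element tree; at best they are
    # pushed into header metadata, which is the gap McGann's critique targets.
    lg.insert(0, ET.Comment(
        "provenance, printing history, reception: not representable as inline tags"))

    print(ET.tostring(lg, encoding="unicode"))

Whether such an encoding can be stretched to cover McGann's other dimensions, or whether those dimensions resist inline tagging altogether, is precisely the question his chapter raises without resolving.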
This stance seems to limit the potential utility of markup, an attitude McGann echoes in his conclusion when he asserts that "... computer markup as currently imagined handicaps or even baffles altogether our moves to engage with the well-known dynamic functions of textual works." McGann's perspective may be interesting, even cutting-edge, but he gives new researchers in digital humanities (the presumed audience of the Companion) few clues about how to implement his approach. The idea of a "digital processing program... that allows one to mark and store these maps of the textual fields and then to study the ways they develop and unfold and how they compare with other textual mappings and transactions" is intriguing, but faced with a choice between McGann's vague, idealistic approach and the far more concrete and accessible markup practice advocated by the TEI, it is hard to see new digital humanists taking the road less traveled.