A little over a week ago I went on a field trip with my Digital Humanities class to the Special Collections library to look at / try out the Hinman Collator and Lindstrand Comparator (1). If you’re reading this and have no idea what either of those things is, don’t worry. I didn’t either before graduate school. But I’m told I now have “street cred” in the DH community because I’ve sort of used the Hinman Collator and Lindstrand Comparator, for all of a minute.
The opportunity to actually use these two machines, both built for collating texts, gave me a better understanding of just how tedious and difficult it can be to compare and combine two or more versions of the same text in service of a larger scholarly question.
In my DH class we have been discussing textual studies approaches and how these approaches have been brought into the digital age. To immerse ourselves in a broad understanding of a deep subject, our readings included Wesley Raabe’s “Harriet Beecher Stowe’s Uncle Tom’s Cabin: A Case Study in Textual Transmission”; Julie Meloni’s “A Pleasant Little Chat about XML”; Allen Renear’s “Text Encoding”; and excerpts from Jerome McGann’s Radiant Textuality (pages 53-97).
In these readings, I must admit that I found the deeper presentations / discussions of SGML, XML, and TEI far more interesting than the discussions of actual textual study (so far as I understand what textual study is). However, I think that knowing and understanding how SGML and XML–defined by Allen Renear as metalanguages (2)–work does help inform the current application of textual studies in the digital age.
I took away from Renear’s article two interesting, and I think important, points regarding TEI and its larger purposes beyond simply encoding text for digital representation.
First, Renear describes how a major purpose behind the creation of TEI–which was, at its core, an interdisciplinary, multicultural act–was to allow digital projects to share data and pursue different theoretical outcomes in ways that were not inhibited by disciplinary or cultural differences. In essence, the creation of TEI was the creation of a language that accommodated “the full range of human written culture, the full range of disciplinary perspectives on those objects, and the full range of competing theories” (Renear). In this, Renear argues that “TEI doesn’t require antecedent agreement about what features of a text are important and how to tell whether they are present; instead, it makes it possible, when the scholar wishes, to communicate a particular theory of what the text is” (Renear).
To me, this suggests that there can exist in the digital sphere a representation of a text that is marked up for multiple disciplinary research inquiries. No longer are separate print editions necessarily needed for each separate scholarly (disciplinary) pursuit into a text. Presuming that such a markup of a digital text is possible (and from these base readings, I do believe it would be), there is a potential for new avenues of scholarly inquiry to flourish as two or more disciplinary / cultural perspectives are placed within / on a text.
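As a sketch of what such multi-perspective markup might look like, here is a hypothetical TEI fragment (invented for illustration, not drawn from any actual project) in which a single line of verse carries two disciplinary layers at once: a metrical analysis in the `met` attribute and a rhetorical reading in a nested `seg` element. All three elements (`lg`, `l`, `seg`) are standard TEI:

```xml
<!-- Hypothetical TEI fragment: one line of verse, two disciplinary layers.
     The met attribute records a metrical scansion; the nested seg element
     records a rhetorical reading of part of the same line. -->
<lg type="stanza">
  <l n="1" met="-+|-+|-+|-+|-+">
    Shall I <seg type="rhetorical" subtype="simile">compare thee
    to a summer's day</seg>?
  </l>
</lg>
```

A prosodist and a rhetorician could each query the same file for their own layer–which is, as I read him, precisely the kind of coexistence Renear says TEI was designed to permit.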
Related to the above, Renear also proclaims that “TEI succeeded [in the] development of a new data description language that substantially improves our ability to describe textual features, not just our ability to exchange descriptions based on current practices.” When I read this sentence in Renear’s article I immediately grabbed for Radiant Textuality to find this quote: “Translating paper-based texts into electronic forms entirely alters one’s view of the original materials” (McGann 82).
While I’m sure that there is a view-altering process inherent to all textual studies, it does seem to me there would be a distinct impact from having to consider how to represent the very basic parts (pages, paragraphs, lines, stanzas, headings) of a written document in a digital environment. Take the paragraph as an example: it is not necessarily defined by page margins and a single indentation. If a paragraph spans several pages in a print text, how is the impression of that changed when it is, potentially, completely visible on a computer screen?
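TEI has, in fact, a concrete answer to exactly this tension: the paragraph is encoded as a logical unit, and the print page break is recorded inside it as an empty “milestone” element, `pb`. A minimal sketch (the text and page number are invented for illustration):

```xml
<!-- A paragraph that ran across two pages in the print source.
     The empty pb element is a milestone: it records the physical
     page break without interrupting the logical paragraph. -->
<p>The argument begins on one page
  <pb n="42"/>
  and concludes on the next, but the paragraph remains a single
  encoded unit that a screen can display whole.</p>
```

The encoding thus preserves both views at once–the bibliographical fact of the page break and the linguistic fact of the continuous paragraph–leaving it to the digital edition to decide which to foreground.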
Within every piece of the larger Digital Humanities we’ve studied so far this year, I’ve always come back to the question: where does the theory fit in?
In pondering textual studies, I believe a space for important and critical theory has been highlighted by Jerome McGann:
For the truth is that all textualizations–but pre-eminently imaginative textualities–are organized through concurrent structures. Texts have bibliographical and linguistic structures, and those are riven by other concurrencies: rhetorical structures, grammatical structures, metrical, sonic, referential. The more complex the structure the more concurrencies are set in play. (90)
With the digital age, scholarly or textual editions are no longer confined to representing the parts of a written document within the confines of the printed page (McGann 67). How a digital edition may choose to represent these layers of a text is certainly important, and certainly requires theoretical understanding and application. The digital editing that any text will undergo on the way to becoming a digital edition will be directed by two points of interest: interdisciplinarity and scholarly inquiry.
Renear has already made a quick case that interdisciplinarity is actually quite difficult to accomplish in practice, and has argued that TEI at least begins to overcome some of these issues, particularly those relating to the physicality of the text. The second of these points may not reveal itself until much later in the process, but inevitably, part of how one understands a written document relies on what one wants to uncover within that document. In both instances, a constant critical eye toward theory (3) may help push the field forward, and the boundaries of digital editions outward.
(1) If you want to read a long-ish article on the Hinman Collator: “‘The External Verities Verified’: Charlton Hinman and The Roots of Mechanical Collation” by Steven Escar Smith [Vol. 53]
(2) From these metalanguages, we get more specific markup languages, like HTML, XHTML, and TEI.
(3) I’m using theory to include theories about textual studies, digital editing, encoding practices, and any influential disciplinary theories that an individual editor might find useful and necessary.