Genetic Criticism and Digital Editing

paper, specified "long paper"
  1. Dirk Van Hulle

    University of Antwerp

  2. Vincent Neyt

    University of Antwerp

Work text

Genetic Criticism and Digital Editing

Dirk Van Hulle

University of Antwerp, Belgium

Vincent Neyt

University of Antwerp, Belgium




Long Paper

automatic collation
modern manuscripts
digital scholarly editing
genetic criticism
Samuel Beckett
scholarly editing

Writing is often inspired by external source texts. Taking an author’s personal library into account is therefore a crucial aspect of mapping creative invention at work, which is the aim of genetic criticism. In the field of genetic criticism (the study of modern manuscripts), Raymonde Debray Genette made a useful distinction between endogenesis (the part of a composition process that involves the writing of draft versions) and exogenesis (the part of the genesis that relates to external source texts—for instance, when an author consults an encyclopedia or makes notes on a book s/he is reading). Most digital genetic editions focus on the endogenesis, but it is also possible to incorporate the exogenesis. In the context of digital scholarly editing, the combination of these two aspects of a work’s genesis involves a form of digital editing that takes into account at least three approaches: (1) a documentary, (2) a textual, and (3) an intertextual approach.
Recent developments in genetic editing (notably including the work of the TEI) have increasingly drawn attention to the importance of the ‘document’, rather than the ‘text’, as the core of the edition. In the meantime, the textual and intertextual aspects should not be neglected, however, especially if one considers the collation of versions a central element of any scholarly edition.
The specific challenge faced by scholarly editors collating modern manuscripts is the complexity of the documents. Autograph manuscripts typically contain numerous cancellations, substitutions, and additions, which complicate the comparison. To examine the possibility of digital collation of modern manuscripts, this paper studies the case of the Beckett Digital Manuscript Project, a project made possible thanks to an ERC Grant (‘CUTS: Creative Undoing and Textual Scholarship’) and whose digital infrastructure is developed as part of the Marie Curie ITN ‘DiXiT’ (Digital Scholarly Editions).

Documentary Approach

Modern manuscripts (notes, sketches, drafts) are not ‘texts’ but ‘protocols for making a text’, as Daniel Ferrer describes them in Logiques du brouillon: Modèles pour une critique génétique (2011). This has consequences for scholarly editing, as Hans Walter Gabler notes in ‘The Primacy of the Document in Editing’, because ‘it is documents that we have, and documents only. In all transmission and all editing, texts are (and, if properly recognised, always have been) constructs from documents. For to edit texts critically means precisely this: to construct them’ (2007, 199). As a consequence, the inclusion of digital facsimiles has almost become a sine qua non in digital scholarly editing. This trend has been a considerable help in making modern manuscripts more accessible, especially when the facsimiles are accompanied by a transcription. But if this transcription is conceived in terms of a document-oriented approach, it often treats every word as a separate ‘island’ (linked to a particular set of coordinates on the facsimile), not as part of a syntactic entity. This is where a textual approach becomes useful.
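The document-oriented data model described above can be illustrated with a minimal sketch. The class and field names below are hypothetical, invented for this example, and do not reflect the BDMP’s actual encoding; the point is only that each transcribed word carries its own facsimile coordinates but no link to the sentence it belongs to.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """A bounding box on the digital facsimile, in pixel coordinates."""
    ulx: int  # upper-left x
    uly: int  # upper-left y
    lrx: int  # lower-right x
    lry: int  # lower-right y

@dataclass
class WordIsland:
    """A transcribed word anchored to a facsimile zone.

    In a purely document-oriented transcription each word is such an
    'island': it records where it sits on the page, but nothing about
    the syntactic entity it is part of.
    """
    text: str
    zone: Zone

# Two adjacent words from the same sentence, each linked only to
# its own set of coordinates, with no relation between them:
island_a = WordIsland("protocol", Zone(120, 340, 210, 362))
island_b = WordIsland("text", Zone(230, 340, 275, 362))
```

A textual approach would add precisely what this model lacks: the ordering and syntactic relations that turn these islands back into a readable draft.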

Textual Approach

In addition to the documentary approach, a textual approach can make modern manuscripts more accessible by offering a plausible interpretation of the ‘protocol’ (to employ Ferrer’s term). If a word is crossed out and an alternative is written above the line, the chronology of these writing acts can—in many cases—be reconstructed, and this editorial interpretation can usefully feed into the alignment table of the collation software. In collaboration with the Huygens ING Institute (The Hague), we integrated the automated collation tool CollateX into the Beckett Digital Manuscript Project (BDMP) and developed a model to incorporate all of these writing acts into the collation (Haentjens Dekker et al., 2014), to be included in a future version of CollateX.
In the meantime, intense collaboration between the lead developers of the BDMP and CollateX (using the existing version 1.5 of CollateX) has resulted in a working alternative to this theoretical model. Deleted and added words are fed into the algorithm with a property declaring them as such, collated against the text of other witnesses, and visualized in the resulting alignment table in the same way as they are visualized in our transcriptions. Complex substitutions can push the software to its limits; corrective mechanisms were therefore developed to guide the automatic collation towards improved results.
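As an illustration only—the BDMP uses CollateX, not the toy alignment below—the core idea can be sketched in a few lines: tokens carry a ‘deleted’ or ‘added’ property, the alignment is computed over the word forms alone, and the property survives into the alignment table so deletions and additions can be rendered as in the transcription. The witness notation and function names here are invented for the example.

```python
import difflib

def tokens(spec):
    """Parse a tiny witness notation: '-word' = deleted, '+word' = added.

    Each token is a (word, status) pair; status is None for unmarked
    text, 'del' for a cancelled word, 'add' for a word added above
    the line.
    """
    out = []
    for w in spec.split():
        if w.startswith('-'):
            out.append((w[1:], 'del'))
        elif w.startswith('+'):
            out.append((w[1:], 'add'))
        else:
            out.append((w, None))
    return out

def collate(witness_a, witness_b):
    """Align two witnesses on their word forms only, keeping the
    status property attached to each token so the resulting table
    can visualize deletions and additions."""
    a, b = tokens(witness_a), tokens(witness_b)
    matcher = difflib.SequenceMatcher(
        a=[w for w, _ in a], b=[w for w, _ in b], autojunk=False)
    table = []
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        table.append((tag, a[i1:i2], b[j1:j2]))
    return table

# A draft in which 'old' is cancelled and 'young' added above the line,
# collated against a fair copy:
table = collate("the -old +young man", "the young man")
```

The first witness’s cancelled ‘old’ ends up in its own row of the table, still marked as a deletion, while ‘young’ aligns across witnesses with its ‘added’ property intact—the behaviour the paragraph above describes for the real alignment table.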

Intertextual Approach

In addition to the documentary and textual approaches, a digital genetic edition may also be expected to map the relations between endo- and exogenesis. To that end, it is possible to integrate a writer’s personal library. But in many cases, only a fraction of what an author has ever read is still extant. In addition to the ‘extant’ library (based on digital facsimiles of the marginalia in the books preserved in Beckett’s apartment in Paris), the BDMP therefore tries to reconstruct and integrate a ‘virtual’ library (based on Beckett’s reading notes in notebooks). Exogenesis defines the contours and the specific form of intertextuality that will be discussed in the paper.
* * *
These three approaches enable users to compare (1) the digital facsimiles of modern manuscripts with the transcriptions, (2) the multiple versions of the drafts (by means of a digital collation tool that marks the variants), and (3) the author’s manuscripts with the source texts to which they refer or allude. Working with these three dimensions, the paper proposes a model for digital scholarly editions that map the interaction between endo- and exogenesis. It formulates a set of criteria for the inclusion of various types of source texts, addressing the boundaries of a virtual library as one of the main challenges of exogenesis in digital editions.


Debray Genette, R. (2007). Génétique et poétique: le cas Flaubert.

Ferrer, D. (2011). Logiques du brouillon: Modèles pour une critique génétique. Seuil, Paris.

Gabler, H. W. (2007). The Primacy of the Document in Editing. 4: 197–207.

Haentjens Dekker, R., Van Hulle, D., Middell, G., Neyt, V., and Van Zundert, J. (2014). Computer-Supported Collation of Modern Manuscripts. Literary and Linguistic Computing.

Shillingsburg, P. (2006). From Gutenberg to Google: Electronic Representations of Literary Texts. Cambridge University Press, Cambridge.


Conference Info


ADHO - 2015
"Global Digital Humanities"

Hosted at Western Sydney University

Sydney, Australia

June 29, 2015 - July 3, 2015

280 works by 609 authors indexed

Series: ADHO (10)

Organizers: ADHO