Digital Humanities, Knowledge Complexity and the Six ‘Aporias’ of Digital Research

Long paper
Authorship
  1. Jennifer C. Edmond

    Trinity College Dublin

  2. Jörg Lehmann

    Universität Tübingen (University of Tübingen)

  3. Mike Priddy

    Data Archiving and Networked Services (DANS)



The idea that Digital Humanities practitioners might provide a translational capacity within and between the arts, humanities, information and computer science, easing collaboration between these disciplines and enhancing shared results, is not a new one: in fact, there is a long tradition of conceptualising at least some digital humanists as “intermediaries” (Edmond, 2005), “translators” (Siemens et al., 2011) or “hybrid people” (Liu et al. 2007; Lutz et al. 2008, cited in Siemens et al., 2011). As the long-predicted mainstreaming of digital humanities and digital methods into arts and humanities research advances, we might expect this transformation of the digital humanities from a disruptive to a supportive force to continue. Furthermore, while some within the academy certainly view the potential industrial relevance of the digital humanities with suspicion (Allington et al., 2016), there are also many voices from industry itself calling for the development of a more humanistic, critical dimension in the work of the ICT industry (Hern, 2018; Madsbjerg, 2017; Hartley, 2017; Copenhagen Letter, 2017; Centre for Humane Technology, 2018).
While it may therefore seem timely to explore, as Liu (2012, 2016) has called for, how the digital humanities might deliver a linchpin set of critical competencies for and reflections on the techno-social interface, how this cultural intervention into technology development might resonate with the core tenets of DH remains unclear. This paper will introduce such a frame of reference by exploring the implications for digital humanities to be found in a corpus of 38 linked interviews about big data research. The project that developed this material, an EU-funded collaboration known as Knowledge Complexity, or KPLEX for short (www.kplex-project.eu), explored in depth the perspectives on and attitudes toward big data found among computer scientists, collections-holding institutions, and an interdisciplinary research community reaching from philosophy to fMRI-based emotion research. The project originally focussed on understanding unconscious bias in such research, but the interviews also expose the depth of the misalignment between the ways knowledge is generated and validated across the contributing disciplines.

The data the project produced therefore offers much food for thought to those of us who identify as digital humanists, as it points toward a number of key barriers commonly faced and ideally negotiated within our hybrid research space. When viewed from the perspective of the KPLEX project’s data, six distinct points of ‘aporia’ arise, places where the interviewees explicitly or tacitly exposed gulfs in epistemic culture that are clearly at the heart of the tensions between disciplines as they seek to collaborate. These gulfs in goals and understanding echo the work of digital humanists, but also expand upon and throw into relief the underlying tensions in their research. While none of these findings presents, strictly speaking, an insoluble problem, the KPLEX interviews clearly illustrate the embeddedness of these challenges in the foundations of the contributing disciplines. This entanglement with professional identities and values raises them above the level of mere barriers, to a status where a more fundamental reconsideration of the scholarship produced within such collaborations may be required. In these fundamentals we may find future avenues for DH to grow in its own right, but also to expand and reconsider its potential impact. This paper will focus its exposition on the nature of and evidence for these gaps given in the interviews, which can be briefly described as follows:

Language matters. In particular, the interviews with computer scientists showed a resistance to discussing what certain key terms might mean or imply, a lack of precision that would draw criticism in a purely humanities context. This impulse weakens the potential for self-reflection in computer science, but it also greatly impedes successful interdisciplinary work, which may progress for extended periods on a falsely constructed sense of common understanding. While this obscurity had already been observed by Borgman (2015), the KPLEX project results provide not only empirical evidence of the phenomenon, but also of its eventual negative consequences.

Context matters. Datafication implies decontextualisation, and this data/context trade-off is only rarely reflected in data-driven methodologies (for a notable exception see Nelson, 2017). But in humanistic enterprises context is indispensable: for a historian, for example, provenance is an all-important facet in the understanding of any source. Yet that which is a potentially harmful data ‘modification’ for one community is a neutral, or in fact positive, process of data ‘cleansing’ for another.

Tools and standards are pharmaka, giving much but taking as well. In particular, information scientists can see how the availability of certain dominant tools (like keyword searches and metadata standards) is liberating and limiting in equal measure. Data and metadata standards can be perceived by humanists as handcuffs, limiting possible iterative adaptation of parameters, but the resulting variability and complexity stand in opposition to interoperability, aggregation, and scaling (Saklofske et al., 2015).

Data without theory is as problematic as theory without evidence. It has been proposed that big data may have delivered us to the ‘end of theory’ (Anderson, 2008), but researchers actively working at the edges of big data can see clearly that this is not the case. Rather, the lack of a critical frame merely pushes much of the interpretation of complex phenomena into a black box, whose authority rests on potentially flawed algorithms.

The power structures of technology inhibit accommodation of analogue or hybrid narratives. Much of the humanistic source landscape is still measured in kilometres of shelving rather than terabytes of data. Because of this, digital humanities practices must be well adapted to resisting the Matthew Effect (Merton, 1968), by which research becomes concentrated on the limited, and potentially flawed, data already available in digital form; this is not always the case outside of the humanities, however. Moreover, the struggle between ‘archival thinking’ and ‘computational thinking’ evidenced in the interviews, and the conceit of routinisation, raise questions of who will control cultural heritage knowledge in the future.

Humanistic competences are not taught in conjunction with digital approaches. Critical, speculative, and hermeneutic thinking, the hallmarks of the humanities, are not taught alongside empirical methodologies, and critical approaches are not systematically implemented in computational studies. Jonathon Morgan’s analysis of the radical right on Twitter (2016) and the Digital Humanities Now ‘Editors’ Choice’ project ‘Torn Apart / Separados’ (2018) are two rare and enlightening exceptions.

The paper will conclude with a series of reflections on how digital humanities researchers could move within their disciplines and beyond to become uniquely able to negotiate some of these critical conversations. It will also address crucial points DH can share with all interdisciplinary collaboration, such as shared data formats and structuring approaches, how misconceptions are surfaced and resolved, the place of self-reflection and methodological discussions, and the incommensurability of research questions and methodologies. In conclusion, it will offer recommendations for how each of the six aporias might be met and used to create a stronger digital humanities community and culture, fulfilling its potential as both a disruptive and productive force.

Bibliography

Allington, D., Brouillette, S. and Golumbia, D. (2016). Neoliberal Tools (and Archives): A Political History of Digital Humanities.
LA Review of Books, May 1st, 2016.

https://www.lareviewofbooks.org/article/neoliberal-tools-archives-political-history-digital-humanities/
(accessed 11 April 2019).

Ahmed, M. et al. (2018). Editors’ Choice: Torn Apart / Separados.
Digital Humanities Now.

http://digitalhumanitiesnow.org/2018/06/editors-choice-torn-apart-separados/
(accessed 11 April 2019).

Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.
Wired, June 23rd, 2008.

https://www.wired.com/2008/06/pb-theory/
(accessed 11 April 2019).

Borgman, C. L. (2015).
Big Data, Little Data, No Data. Scholarship in the Networked World. Cambridge, MA: The MIT Press.

The Centre for Humane Technology. http://humanetech.com/.

The Copenhagen Letter. https://copenhagenletter.org/.

Edmond, J. (2005). The Role of the Professional Intermediary in Expanding the Humanities Computing Base.
Literary and Linguistic Computing, Vol. 20, No. 3, pp. 367-380.

Hartley, S. (2017).
The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World. New York: Houghton Mifflin.

Hern, A. (2018). Tech suffers from lack of humanities, says Mozilla head.
The Guardian, October 12th, 2018.

https://www.theguardian.com/technology/2018/oct/12/tech-humanities-misinformation-philosophy-psychology-graduates-mozilla-head-mitchell-baker?CMP=share_btn_tw
(accessed 11 April 2019).

Knowledge Complexity (KPLEX).

www.kplex-project.eu
. (accessed 11 April 2019).

Liu, A. (2012). Where is Cultural Criticism in the Digital Humanities?

http://dhdebates.gc.cuny.edu/debates/text/20
(accessed 11 April 2019).

Liu, A. (2016). Drafts for Against the Cultural Singularity (book in progress).

http://liu.english.ucsb.edu/drafts-for-against-the-cultural-singularity/
(accessed 11 April 2019).

Madsbjerg, C. (2017).
Sensemaking. What Makes Human Intelligence Essential in the Age of the Algorithm. London: Little Brown.

Merton, R. K. (1968). The Matthew Effect in Science.
Science. 159 (3810): 56–63. doi:10.1126/science.159.3810.56

Morgan, J. (2016). The Radical Right and the Threat of Violence.
Medium.com

https://medium.com/@jonathonmorgan/the-radical-right-and-the-threat-of-violence-f66288ac8c4
(accessed 11 April 2019).

Nelson, L. K. (2017). Computational Grounded Theory: A Methodological Framework.
Sociological Methods & Research. DOI:

https://doi.org/10.1177/0049124117729703
(accessed 11 April 2019).

Saklofske, J. and Research Team (2015). NewRadial: Challenging scales and standards of humanities scholarship through new knowledge environment prototypes.
Digital Studies/le Champ Numérique. DOI:

https://doi.org/10.16995/dscn.24
(accessed 11 April 2019).

Siemens, L., Cunningham, R., Duff, W. and Warwick, C. (2011). A tale of two cities: implications of the similarities and differences in collaborative approaches within the digital libraries and digital humanities communities.
Literary and Linguistic Computing, Vol. 26, No. 3, 335-348.
