Pedagogy, Agency, and Media Culture
I recently attended a conference and participated in a panel on the relationship between the digital humanities and media studies. The intention of the panel seemed to be to discuss the topic at hand while including the audience in the discussion. I prepared a short list of bullet points concerning what I find most problematic about much of the conversation around the digital humanities. One of my main concerns arises from what seems like a naive suggestion that bringing quantitative data to bear on texts is an innovation. A quick look at the historical underpinnings of DH brings forth several earlier iterations focused on computers and the humanities, all of which applied quantitative data to texts (although, admittedly, not always with the tools or techniques being used currently). A more problematic gap arises, however, in discussions of the power of big data and its relationship to the humanities. There have been many critical interventions over the years that address exactly how troublesome, and dominating, the use of quantitative data can become in relation to knowledge production and to scholarship and its practice.
At this panel, another scholar presented, in lieu of a talk, a fairly well-developed and implemented quantitative research project conducted by a group of researchers at his institution. While the project was interesting on its own merits, there seemed to be an assertion that this kind of work had more rigor than comparable work in the more traditional humanities. This scholar even categorized himself as a social scientist by training and began to reel off the value of the type of project he was presenting (as we all do with our own work). At one point he said, paraphrasing here, that the data spoke for itself. That is what I would describe as the brute force of big data. Adorno's critique of administrative research seems particularly apt here: his analysis of the work being done by his colleagues at Columbia sought to identify the complicity with power that developed as the quantitative overshadowed the qualitative (big data demands big dollars, to be reductive in another sense). Adorno suggested that without a critical framework from which to develop research questions and methods, that is, the interpretive methods for understanding the significance of data (small or large), much of the research that uses quantitative data simply works to reduce the world to a set of numbers: a brutalization of meaning and a tool for dominating one's colleagues, students, and constituents. But it does help to make careers, as we are currently seeing in the discipline of English.