MLA 2022 Session on “New Methods to Explore Digital Archives”

Modern Language Association convention 2022
SHARP session on “New Methods to Explore Digital Archives”

Online, Friday 7 January 8.30am to 9.45am (EST).

Dr Nora C. Benedict
Assistant Professor of Spanish & Digital Humanities
Department of Romance Languages, University of Georgia

Title: “Buyers versus Borrowers: A Look at the Finances of Shakespeare and Company”

Sylvia Beach is known for her “imperfect record keeping” and often indecipherable business accounts (Fitch 161). Joshua Kotin has even gone so far as to say that it would take “[a] team of forensic accountants…to reconstruct the finances of Shakespeare and Company” (121). That said, data from her lending library cards and logbooks provides key insight into Shakespeare and Company’s cash flow. While this financial information is not always presented in a systematic or exhaustive manner, it can still be used to develop a more nuanced understanding of the inner workings of Beach’s literary enterprise. To that end, in this paper I use the Shakespeare and Company Project datasets to examine the exchange of material goods in Beach’s bookshop. More specifically, my analysis centers on the details surrounding purchases and borrows of books from the events dataset. By limiting the scope of my study to only those records that contain transactional data—whether in the form of membership fees or actual book purchases—I unearth a new array of networks that were central to the daily operations of Shakespeare and Company. As a result, in contrast with the common focus on solely the most notable lending library members (or members of the Lost Generation in general), this financial approach brings to light invisible networks and underexplored figures whose monetary contributions were essential to keeping Beach’s business afloat.

Lawrence Evalyn
PhD candidate in English
University of Toronto

Title: “Random Sampling in the Digital Archive”

One of the lessons of distant reading has been that history contains many millions more books than we can actually read. Computational literary study has learned to be explicit about textual selection, but debates about method in non-computational research are often focused on the methods of analysis or rhetorical persuasion carried out by a piece of writing, rather than the work that precedes writing, namely, discovering and reading texts. As a provocation to our expectations of method, I have taken ten entirely random titles published in England between 1789 and 1799, and close-read them for an analysis of that decade’s contentious print culture. I expected this process to be an illuminating failure, but instead have found that critical interpretation is fully capable of locating important narratives about gender, war, racial difference, and religion, even when examining a prophetic pamphlet about a lunar eclipse alongside a budget report for the East India Company. In this paper I will particularly discuss how a random sample sheds new light on eighteenth-century medical misinformation. This experiment highlights the value of embracing the true scope of what is held in digital archives, and suggests that new methods of exploring digital archives could be excitingly alien.
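The sampling step described above (drawing titles uniformly at random from a period bibliography) is straightforward to reproduce. A minimal sketch in Python, with a placeholder list standing in for a real catalogue export such as an ESTC query result; the titles and catalogue size here are invented:

```python
import random

# Placeholder catalogue: in practice this would be rows exported from a
# bibliography of English publications, 1789-1799.
catalogue = [f"Title {i} (London, {1789 + i % 11})" for i in range(5000)]

rng = random.Random(42)  # fixed seed so the sample can be reproduced
sample = rng.sample(catalogue, k=10)  # uniform sampling without replacement

for title in sample:
    print(title)
```

Fixing the seed matters methodologically: it lets a reader verify that the ten titles really were drawn at random rather than curated after the fact.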

Dr Jennifer Burek Pierce
Associate Professor
School of Library and Information Science, University of Iowa

Title: “Finding Fictional Places on Actual Maps: A Case Study of Methods for Locating Reader Responses in the Digital World”

Matt Kirschenbaum has described the archive as “unbounded” and always in the process of creation, an apt description of digital media that document reading. These media appear on multiple platforms but are otherwise uncollected and unpreserved. Research that analyzes contemporary digital reading must respond to these conditions, particularly as individuals reconsider and remove their accounts from different platforms.

Digital mapping is a distinctive mode of reader response. Google Maps and other mapping technologies allow users to annotate professionally created maps – a practice known as folk cartography – and readers adapt this technology to their own ends by adding fictional places from favorite books to real digital maps. One example emerges from reader response to Rainbow Rowell’s best-selling Simon Snow trilogy and readers’ decision to put her fictional Watford School of Magicks on Google Maps.1 The hidden school, a parallel to J.K. Rowling’s Hogwarts, gained reviews and images that reflected how readers read and envisioned places in Rowell’s narrative. Understanding this practice requires research methods that allow us to locate and study map-based media that document reading.

The Watford example is significant because only select imaginary places have been mapped onto the real world. If we search Google Maps for venues listed in Manguel and Guadalupi’s Dictionary of Imaginary Places, we find that many fictional places have no digital correlates. This invites us to consider why certain fictional sites are selected for mapping, how they are realized on technology platforms, and how we locate them.

When fans add fictional places to real maps, their voices are inscribed and stored in ways that augment what Kirschenbaum characterizes as the “heterogeneity of digital data and its embodied inscriptions.”2 Simple searches of maps are not an effective way of finding these sites. Guidebooks, news stories, and social media help highlight these sites of reader response, a kind of triangulation. Crucially, affect, or readers’ feelings for stories, is an important cue to the sort of narratives that might be realized on maps.

Dr Zackary Turpin
Assistant Professor / Director of Graduate Studies
English Department, University of Idaho

Since Walt Whitman’s death, the rediscovery of his lost publications has been a surprisingly regular process, turning up everything from manifestos and travel writings to a men’s wellness guide and a serialized novel. The search for lost Whitman works has also evolved substantially, with major discoveries of the poet’s unknown publications coming increasingly through digital means, thanks to his extensive publication record (signed or unsigned) in more than one hundred known newspapers, as well as his fondness for reusing pen names and initialisms. The rediscovery of lost texts, however, formerly done by way of manuscript and bibliographic evidence alone, is today being augmented with new digital methodologies, which enhance researchers’ efficacy and extend their reach into digital newspaper and manuscript archives. In this presentation, I will enumerate the strengths and weaknesses of some of the newest digital methods aiding the recovery of lost Whitman publications, including byline searches, metadata triangulation, computational stylometry, and idiolectic analysis. Such methods may turn up any number of lost texts, including not one but two Whitman novels that may still be missing, The Sleeptalker (ca. 1850-51) and Proud Antoinette (ca. 1858-60).

1 @rainbowrowell, “What do you do … Penny’s mom is going to be so PISSED,” Twitter (15 Sept. 2019); @rainbowrowell, “I guess I should leave a review,” Twitter (15 Sept. 2019).

2 Kirschenbaum, Mechanisms, 6.

MLA 2021 Session on “Towards Sustainability for Digital Archives and Projects”

Modern Language Association convention 2021
SHARP session on “Towards sustainability for digital archives and projects”

Sunday 10 January, 5.15pm to 6.30pm

Dr Chelsea Gunn, University of Pittsburgh, USA
Dr Alison Langmead, University of Pittsburgh, USA
Dr Aisling Quigley, Macalester College, USA


Over the last decade, the digital humanities community has become increasingly concerned with the ongoing sustainability of digital projects. This anxiety stems in part from the realization that not all digital humanities projects have identical expectations of longevity. Several prominent works in the literature, such as Bethany Nowviskie and Dot Porter’s “Graceful Degradation Survey Findings: How Do We Manage Digital Humanities Projects through Times of Transition and Decline?” (2010) and Geoffrey Rockwell et al.’s “Burying Dead Projects: Depositing the Globalization Compendium” (2014), have been central to this intellectual exchange about the benefits of creating sustainability plans for projects that do not necessarily assume a default permanence, but that instead proactively consider each project’s most suitable longevity strategy.

With this realization has come a concomitant expectation: each digital humanities project must create its own customized sustainability plan, designed with its particular requirements in mind. And yet, few digital humanists have access to direct training on the process of creating and implementing professional-grade digital preservation and sustainability practices for their own work. To support the process of designing and implementing digital sustainability plans for this work, a team of scholars housed in the Visual Media Workshop at the University of Pittsburgh has created the Socio-Technical Sustainability Roadmap (STSR). The STSR is a structured, process-oriented workshop, inspired by design thinking and collaborative learning approaches. This workshop, which may be implemented in a variety of institutional contexts, guides project stakeholders through the practice of creating effective, iterative, ongoing digital sustainability strategies that address the needs of both social and technological infrastructures. It is founded on the fundamental assumption that, for sustainability practices to be successful, project leaders must keep the changing, socially-contingent nature of both their project and their working environment(s) consistently in mind as they initiate, maintain, and support their own work. For this panel, we contextualize and describe the STSR, and provide reflections based on our experiences facilitating Sustaining DH: An NEH Institute for Advanced Topics in the Digital Humanities.

Works Cited:

Nowviskie, Bethany, and Dot Porter. “Graceful Degradation Survey Findings: How Do We Manage Digital Humanities Projects Through Times of Transition and Decline?” Digital Humanities 2010, London.

Rockwell, Geoffrey, et al. “Burying Dead Projects: Depositing the Globalization Compendium.” Digital Humanities Quarterly 8.2 (2014).

David Underdown, The National Archives, UK

DiAGRAM: Digital Archiving Graphical Risk Assessment Model – A statistical approach to digital archive risk management and sustainability

Digital heritage is rich, complex and fragile. This material – born-digital records (in a variety of formats), web archives, digitised archival materials – is under threat from rapidly evolving technology. To a far greater extent than analogue archives, sustaining digital archives requires ongoing investment in the technology of the archive’s systems and the technical skills of its staff.

The National Archives UK has taken a collaborative approach to managing digital preservation risk, bringing established statistical risk management methods into the digital heritage sphere. A combination of our staff, statisticians from the University of Warwick, and experts from five other UK archives has allowed us to combine statistical data with expert knowledge to develop a decision support tool mapping and quantifying the risks and uncertainty in digital preservation. The project was supported by the National Lottery Heritage Fund.


The resulting tool, DiAGRAM:
  • Improves users’ understanding of the complex digital archiving risk landscape and of the interplay between digital archiving risk factors.
  • Empowers archivists to compare and prioritise very different types of threats to the digital archive: from software obsolescence to natural disaster.
  • Aids in quantifying the impact of risk events and risk management strategies on archival outcomes for use in decision making, communication with stakeholders and developing business cases for targeted action.
  • Measures the likelihood of permanent availability of digital materials as a function of renderability and intellectual control.

DiAGRAM’s foundation is a Bayesian Network – a statistical model estimating the probability of outcomes by considering conditional events (e.g. storage life depends on media type). Bayesian Networks are used as a foundation for decision support tools in a variety of contexts including aviation, credit scoring, and food security, and are widely used in risk assessment.
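The underlying mechanism can be illustrated in miniature. The sketch below is not DiAGRAM itself; the probabilities are invented, and the structure is reduced to the two factors named above (renderability and intellectual control) feeding a single outcome node, marginalised by enumerating the parent states:

```python
# Toy two-parent Bayesian network: "availability" depends on
# "renderability" and "intellectual control". All numbers are illustrative.
p_render = 0.9   # P(material still renderable)
p_control = 0.8  # P(intellectual control maintained)

# Conditional probability table: P(available | renderable, control)
p_avail = {
    (True, True): 0.99,
    (True, False): 0.60,
    (False, True): 0.40,
    (False, False): 0.05,
}

# Marginal P(available), summing over all combinations of parent states
total = 0.0
for r in (True, False):
    for c in (True, False):
        pr = p_render if r else 1 - p_render
        pc = p_control if c else 1 - p_control
        total += pr * pc * p_avail[(r, c)]

print(round(total, 4))
```

Changing a single input (say, improving intellectual control from 0.8 to 0.95) and re-running the enumeration is exactly the kind of "what if" comparison a decision support tool of this type makes possible.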

DiAGRAM was used to model The National Archives’ own digital holdings, with the outputs being used as supporting evidence in the UK government’s recent spending review, helping to secure a 12% budget increase for The National Archives for fiscal year 2021-22.

Dr Melodee Beals, Loughborough University, UK

“Breaking Silos: Ensuring the Sustainability of Digitised Newspaper Collections through Academic/Archival Collaboration”

Digitised newspaper collections are vital in preserving not only national heritage but also global news exchanges. Alongside the growing number of historical newspapers being digitised and made available online, born-digital newspapers are being added to collections in vast numbers. As such, newspaper collections offer a unique insight into the problems and opportunities of both digitised and born-digital archives.

This short paper will draw on research from the ‘Oceanic Exchanges: Tracing Global Information Networks in Historical Newspaper Repositories, 1840-1914’ project, in which academics from six countries worked closely with digitised newspaper collections held by the British Library, the national libraries of Australia, Finland, Germany, Mexico, the Netherlands and New Zealand, and the aggregator Europeana, to investigate international news exchanges in the nineteenth century. As part of this work, we conducted interviews with librarians and investigated the metadata structures of the collections. Our findings reveal critical issues around sustainability, particularly the risk of losing the record of institutional decision-making when it is not documented but instead passed by word of mouth between individual archivists. We will analyse the impact of institutional and national silos on sustainability, and argue that increased transparency and integration are central to ensuring that digital archives, and the projects that use them, remain sustainable. For this, academic/archival collaboration is essential.

We will introduce the Atlas of Digitised Newspapers and Metadata, a new Open Access guide to digitised newspaper collections around the world. It draws on our interviews with archivists and combines information about the collections’ histories with XPaths from their metadata, information about the historical evolution of the newspaper itself, and a literature review demonstrating how researchers understand these sources. Our Atlas, which is now open to contributions, offers one model for collecting this underused and undervalued information about digitised newspaper archives and ensuring sustainability.

Dr Janelle Jenstad, University of Victoria, Canada

“The Endings Project: Principles for Releasing Archivable Digital Humanities Projects.”

In 2020, the Endings Project – a collaboration between Librarians, Developers, and DH project leaders – comes to an end. Painfully aware of the fragility, temporality, and ephemerality of DH projects, our team has spent five years devising techniques to preserve and archive projects without sacrificing the dynamic features that make them readable, searchable, and interactive. The “Endings Principles for Digital Longevity” address the five components of digital projects: Data, Products, Processing, Documentation, and Release Management (1). This paper gives a brief overview of these principles, and then discusses the release model as an extension of traditional print publishing through the lens of one editorial project: The Map of Early Modern London (MoEML). In 2018, MoEML was “finished,” i.e., “endings-compliant and fully archivable.” Yet the team continues work on an anthology of early modern pageants and on its editions of John Stow’s Survey of London. How is it possible to be “finished” and “ongoing” simultaneously? Textual history teaches us that texts can be both published and fluid (2). MoEML has adopted a release management model based on editions of print works (3), in particular on our versioned edition of the 1598, 1603, 1618, and 1633 texts of the Survey. The 1633 edition claims to be “now completely finished” (4). Despite this claim, the business of surveying London in words was not finished. This model of incremental fluidity has inspired us to think about digital releases as editions. This paper – which builds on our 2017 SHARP panel – sets out a plan for multiple graceful digital “endings.”

Works Cited:

(1) The Endings Project: Building Sustainable Digital Humanities Projects.

(2) John Bryant, The Fluid Text: A Theory of Revision and Editing for Book and Screen (U of Michigan Press, 2002).

(3) “Principles: Release Management.”

(4) John Stow, Anthony Munday, Humphrey Dyson, and others. The Survey of London (London: Elizabeth Purslowe, 1633; STC 23345).

Prof. Kenneth Price, University of Nebraska-Lincoln, USA

“Good Strategies and Inescapable Uncertainties in Building Sustainable Digital Archives”

The Walt Whitman Archive has been sustained for 25 years by grant support and also by knowledgeable staff; the revitalizing work of graduate and undergraduate students; and the enthusiasm and many contributions of Whitman scholars, readers, and aficionados. Beyond funding and technologies, digital projects also need a human network.

Given the fragility of digital work, what can be done to mitigate the dangers? Open access is key—making materials readily available for others to build on our work in new and complementary ways. We also must leave our creations in formats that do not require herculean efforts to preserve them. Future librarians and scholars will need to migrate materials to new operating systems, interfaces, and infrastructures. It is easier to preserve the raw data than the interface. And yet much of the scholarly argument of an archive resides precisely in the interface, where content is organized, contextualized, and packaged in ways that frame understanding and enable interpretations. The interface, unfortunately, is the aspect of our work hardest to preserve, and libraries rarely ingest digital projects in their full complexity.

We should also build so that others may advance their own work, hopefully using our efforts as a foundation for their own project, even as they may oppose the often implicit arguments embedded within an archive or edition. To enable such possibilities, we need to make our public assets and our behind-the-scenes work, too, as open as possible. And we need to document our processes and uncertainties, failures as well as successes. At least knowing a research team’s rationale for their initial treatment of material will make it easier for later generations to duplicate (or improve upon) our creation in whatever forms make the most sense in the future.

Dr Matthew K. Gold, CUNY Graduate Center, USA

“Sustaining Digital Scholarly Infrastructure: The Manifold Use Case”

This presentation focuses on the sustainability practices and strategies that open-source, scholar-led community publishing platforms can pursue as they seek to sustain themselves when grant funding ends. It focuses on the sustainability infrastructures being developed for Manifold, an open-source publishing platform supported by the Andrew W. Mellon Foundation, the National Endowment for the Humanities, and a range of community partners. Mellon’s initial grant was part of a round of funding offered to university presses to explore sustainable paths forward for digital monographs.

In his 2019 report “Mind the Gap,” John Maxwell presented an overview of open publishing tools and platforms, arguing that “open publishing needs new infrastructure that incentivizes sustainability, cooperation, collaboration, and integration.” Maxwell’s articulation of the structural challenges involved in achieving sustainability has resonances with Kathleen Fitzpatrick’s recent book, Generous Thinking, which argues that institutions need to take collaborative, rather than competitive, approaches in the face of the austerity measures experienced by many public educational systems in the wake of rampant privatization.

This presentation will explore the larger sustainability context facing open-source community publishing platforms, grounding that discussion in the immediate challenges facing real-world projects like Manifold. I will share Manifold’s approach to sustainability from business, technical, institutional, infrastructural, and social perspectives. The presentation will describe how Manifold is attempting to meet the challenges involved in open-source community publishing, along with strategies that attendees can employ for their own projects.

Dr Molly Hardy, National Endowment for the Humanities, USA

“Legacy Work and Funding Models for Digital Infrastructure”

Legacy was, and in many ways still is, the defining value of brick-and-mortar archives, which traditionally strive for preservation of the past to access it in the present and ensure its future. With more of this work being done in digital environments, cultural heritage practitioners are left to consider anew how to sustain collections in multiple ways. Recently, a community of interdisciplinary scholars who identify themselves as “Information Maintainers” have called for a reconsideration of the work of sustainability as dynamic and multi-faceted. The Information Maintainers argue that “Maintenance is not the opposite of change … and its primary aim and value is not to uphold stasis. We view acts of repurposing and revision or reuse as part of maintenance” (14-15).

This paper will address the role of funding agencies in digital sustainability. With seemingly contradictory temporal imperatives—grant funding is short-term and by definition finite, while sustainability is long-term and aims to be infinite—grant funding can and should still play a central role in maintaining, modernizing, and sustaining digital infrastructure to ensure its central role in twenty-first-century legacy work. This paper will consider models of using grant funding to sustain digital remediations of the literary historical record, such as the Walt Whitman Archive and the Early English Ballad Archive. It will also consider the use of such funding to sustain digital platforms for the humanities, with the example of Humanities Commons, which is currently transitioning from its founding home at the Modern Language Association to Michigan State University. Digital sustainability, rightly understood, offers humanists a chance to consider not only what and how to do things with archives and platforms in digital environments, but also why they do them and how to make legacy work operational in the new knowledge economy.

MLA 2020 Session on “Spenser and Digital Humanities”

Organised by the International Spenser Society and SHARP
Thursday 9 January, 3.30pm to 4.45pm, 616 (WSCC)


Joseph Loewenstein and Anupam Basu

The foundational work of the Text Creation Partnership, and the supplementary efforts of colleagues at Washington University and Northwestern, have given early modernists a richly annotated corpus: 60,000 printed books, 1.65 billion words, with each word preserved in original and regularized spelling and tagged by part of speech, and each document searchable not only by word or phrase (with plenty of flexibility for the substitution of part-of-speech placeholders), but also by literary structure, making it possible to profile literary idiosyncrasy at a range of scales.  Having already assessed the (very high) degree to which Spenser’s spelling in print conforms to roiling orthographic norms across his career, we offer a preliminary report on his lexical and syntactic profile, measured against a “small” corpus of verse — extracted from about 1500 texts printed between 1561 and 1600.  Using some simple metrics, we can start to tell you what’s distinctive about Spenser’s lexicon — not just the odd words, but the less odd ones that he uses disproportionately; we can also tell you whose verse practice clusters with his and what the vectors of similarity are.  And we will.  If there’s time, we’ll branch out towards the more difficult problem of how to move from profiling by means of lexical clustering to the more demanding task of syntactic profiling.

Craig Berry
Prosaic Diction: the Words of Spenser’s Prose

We know that, as a secretary, Spenser wrote a great deal of prose. We have his long prose treatise A View of the Present State of Ireland as well as the shorter Brief Note of Ireland, the Axiochus, and a couple of published letters. The purposes and audiences of these texts and their generic horizons of expectation differ in various ways from each other as well as from those of Spenser’s poetry. This paper will consider specifically how and whether Spenser’s word choices in the prose works differ from or align with the diction of his poetry.

Most Spenserians can readily think of rare or eccentric word choices, especially in the poetry, but this paper takes a different approach in which the cruxes and exceptions will be less important than large-scale trends.  This work starts with lemmatized digital texts (where the lemma is the dictionary head word leveling out all inflection and spelling variation) and applies statistical methods, notably log likelihood ratios and z-scores, to measure difference and similarity between different word collections.  No statistical background will be required to understand that having a look at words far more likely or far less likely to occur in the prose than in the poetry (or vice versa) may illuminate Spenser’s practice in ways that confirm or challenge the intuitions of experienced readers. At least tentative answers will be given to such questions as what words are unique to the prose corpus and what parts of Spenser’s poetic corpus have the greatest (or least) affinity, vocabulary-wise, with the prose.
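The log-likelihood statistic mentioned above, in its common corpus-linguistics form (Dunning's G2 as popularised by Rayson and Garside), compares a lemma's observed counts in two corpora against the counts expected if the lemma were evenly distributed. A small sketch, with invented counts rather than real Spenserian data:

```python
import math

def log_likelihood(a, b, c, d):
    """Dunning's G2 in the two-corpus form.
    a, b = counts of the lemma in corpus 1 and corpus 2;
    c, d = total tokens in corpus 1 and corpus 2."""
    e1 = c * (a + b) / (c + d)  # expected count in corpus 1
    e2 = d * (a + b) / (c + d)  # expected count in corpus 2
    g2 = 0.0
    if a:
        g2 += a * math.log(a / e1)
    if b:
        g2 += b * math.log(b / e2)
    return 2 * g2

# Invented example: a lemma occurring 40 times in a 50,000-word prose
# corpus but only 5 times in a 100,000-word verse corpus.
print(round(log_likelihood(40, 5, 50_000, 100_000), 2))
```

The larger the G2 value, the less plausible it is that the difference in frequency is due to chance; sorting all lemmata by this score surfaces exactly the "far more likely or far less likely" words the paper proposes to examine.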

John R. Ladd
Spenserian Digital Deformance and the Interpretive Power of Playfulness

Digital tools give researchers many ways to disassemble and reassemble literary works. Some of these rearrangements are used for straightforward analytical purposes: representing a corpus as a “bag of words” to allow statistical analyses of vocabulary, for example. However, this process of taking literature apart and recombining it in new ways can be creative, even playful. Using Jerome McGann’s concept of deformance—a portmanteau of deform and performance—I will present several projects that reconfigure our understanding of Spenser’s works by presenting his verse to us in digitally-altered ways.

In one project, Spenser’s Color Wheel, I visualize Spenser’s use of different color terms in The Faerie Queene and The Shepheardes Calender, allowing the user to choose the lines that invoke a particular set of colors to construct evocative new poems. In another project, the Twitter bot @endlessmonument, I wrote a script that delivers lines of Spenser’s Epithalamion in accordance with the poem’s famously complex time-scheme: the social media reader then encounters the poem in short, temporally-fixed pieces alongside millions of other tweets. These projects grew out of my work with the Spenser Project, the digital edition of Spenser’s Complete Works. Following McGann, I will reflect on the ways in which these deformance projects help us to think about the alterations and interpretive choices of digital editing.

These deformances are more than creative side projects—by rearranging Spenser’s works in unexpected ways they direct readers’ attention to specific formal elements and authorial choices. I argue that deformance of Spenser’s work is interpretive—in coding them I made interpretive choices about what the reader should see in Spenser’s use of time and of color, and in exploring them readers are invited to spin out new interpretations of their own.


MLA 2020 SHARP Session: “Databases and Print Culture Studies”

Friday 10 January, 10.15am to 11.30am, 303 (WSCC)


Katherine Bode is Professor of literary and textual studies at the Australian National University. Her books include A World of Fiction: Digital Collections and the Future of Literary History (2018) and Reading by Numbers: Recalibrating the Literary Field (2012).

Anthony Glinoer holds the Canada Research Chair in the history of publishing and literary sociology and is a professor at the University of Sherbrooke (Quebec). His work focuses primarily on the history of publishing (Naissance de l’Éditeur with Pascal Durand in 2005), on the study of representations of the literary life (La bohème. Une figure de l’imaginaire social in 2018), and on groups of authors and artists (L’âge des cénacles with Vincent Laisney in 2013). Anthony Glinoer also leads the Socius project, which has produced re-editions of the classics in literary social theory, re-edited or original bibliographies, and a lexicon of concepts (see the open-access site).

Ted Underwood is Professor of Information Sciences and English at the University of Illinois, Urbana-Champaign. He is the author of three books. His most recent, Distant Horizons: Digital Evidence and Literary Change (Chicago, 2019) explores a corpus of more than a hundred thousand nineteenth- and twentieth-century works obtained through HathiTrust Digital Library.


Katherine Bode, “Data beyond representation: From computational modelling to performative materiality”

Computational modelling has become the central paradigm for data-rich research in literary and print culture studies (e.g. Bode World, Piper, So, Underwood). Because models are arguments about, rather than descriptions of, literary phenomena, modelling offers a richer, more flexible framework for such research than earlier, positivist approaches (Bode “Equivalence”). But what are the things-in-the-world that we model when we transition from print cultural objects, such as books, to mass-digitised (and digitalised) collections? Or, to put the question another way, what connections and/or distinctions are we justified in drawing between an individual text in these collections; other texts, either referred to or on the same platform; the platform itself; the entity that created and/or owns the platform; and the wider digital and non-digital ecosystem?

This challenge is part of what Alan Liu describes as a wider transition from a regime of “rhetoric-representation-interpretation” to one of “communication-information-media.” That shift renders indefinite, even unintelligible, foundational concepts in the emerging field of digital literary and print cultural studies. Data “leaks past the margins of” rhetoric-representation-interpretation, while models are “not exactly like any of the[se] concepts …. Models, uncannily, are all, and none, of the above” (4). With the old regime “hollowed out … from the inside,” and the new one hopelessly abstract, what are we to do? Liu’s solution is to embrace (while remaining wary of) the terminology of the new regime. In asking where our new “texts” begin and/or end – and whether we should think of them as “texts” at all – I outline an alternative case for a performative materialist approach to data-rich research in literary and print cultural studies. Recognising that literary phenomena have always been performatively produced rather than self-evident enables a form of data-rich research that is adequate (but not exclusive) to the changing methods, forms, and materials of literary research.

Works Cited

Bode, Katherine. A World of Fiction: Digital Collections and the Future of Literary History. University of Michigan Press, 2018.

Bode, Katherine. “The Equivalence of ‘Close’ and ‘Distant’ Reading.” Modern Language Quarterly 78.1 (2017): 77–106.

Liu, Alan. Friending the Past: The Sense of History in the Digital Age. University of Chicago Press, 2019.

Piper, Andrew. Enumerations: Data and Literary Study. University of Chicago Press, 2019.

So, Richard Jean. “All Models Are Wrong.” PMLA 132.3 (2017): 668–73.

Underwood, Ted. Distant Horizons: Digital Evidence and Literary Change. University of Chicago Press, 2019.

Anthony Glinoer, “Developing and Using Databases in Book History: The Case of the Archives éditoriales Platform”

This paper presents the new internet platform Archives éditoriales and the research partnership project on francophone publishers’ archives that made the platform possible. The project gathers archivists and researchers from various institutions (universities, archives centres, publishers’ associations) and various regions (Belgium, Canada, France, Switzerland) in the francophone world. Our main objectives are to promote and to study the archives of the publishing world from 1945 to today. Among the tools made available on the platform (a database of more than a thousand interviews with francophone publishers about their publishing activity, digital exhibitions, a blog, etc.), this paper will focus on the database of publishers’ archives, addressing the questions of why, how, and when publishing houses tend to donate their archives to public institutions.

Ted Underwood, “Toward a Distant Reading of Reception”

The collections used by distant readers have often emphasized literary production, neglecting the questions about circulation and reception that are central to book history (Bode). Book historians have addressed this gap in several ways: Anne DeWitt and Ryan Cordell have used computational methods to trace nineteenth-century literary circulation; Lynne Tatlock et al. have studied gendered reading patterns at the Muncie Public Library in the 1890s; and Peter Boot has created corpora of recent responses to fiction. But work of this kind remains difficult, especially if we seek to study reception across a long, century-spanning timeline.

To make that easier, a group of scholars centered at the University of Illinois has started to build a database of journalistic responses to English-language fiction, distributed across the timeline from 1840 to 2009. The team includes Kent Chang, Yuerong Hu, Wenyi Shang, Aniruddha Sharma, Shubhangi Singhal, Jessica Witte, and myself. The reviews and review excerpts are drawn mostly from British, American, and Canadian periodicals, and from reference works like the Book Review Digest. We already have 25,000 responses, and expect to have 100,000 by the end of the academic year.

Pairing text from book reviews with the texts of the books described has allowed us to start to ask whether the claims distant readers have advanced about literary works themselves also hold true about larger systems of literary circulation. For instance, distant readers have argued that textual differences between literary works can measure the strength (or weakness) of generic boundaries. If these degrees of textual difference really tell us anything meaningful about literature’s social existence, readers’ responses to the books ought to display the same patterns of difference or similarity. The Illinois book review database has allowed us to demonstrate that this is true. On the basis of this enlarged evidence, we have begun to make a case that genre boundaries generally become clearer between 1870 and 1930.

Works Cited

Bode, Katherine. “The Equivalence of ‘Close’ and ‘Distant’ Reading; or, Toward a New Object for Data-Rich Literary History.” Modern Language Quarterly 78.1 (2017): 77-106.

Boot, Peter. “A Database of Online Book Responses and the Nature of the Literary Thriller.” Digital Humanities 2017, Montreal.

Cordell, Ryan. “Reprinting, Circulation, and the Network Author in Antebellum Newspapers.” American Literary History 27.3 (2015): 1–29.

DeWitt, Anne. “Advances in the Visualization of Data: The Network of Genre in the Victorian Periodical Press.” Victorian Periodicals Review 48.2 (2015): 161–82.

Tatlock, Lynne, et al. “Crossing Over: Gendered Reading Formations at the Muncie Public Library, 1891–1902.” Journal of Cultural Analytics, 22 March 2018.


Posted in MLA

MLA 2019 SHARP Panel: “New Perspectives in Book History”

Panel 484, 12:00–1:15 p.m., Saturday 5 January, Hyatt Regency Roosevelt I, Chicago, Illinois

Description: Panelists explore new approaches that are altering the way we do book history, such as global perspectives and digital humanities, ephemeral book objects in twentieth-century book history, MercadoLibre, and the promise of 3D technologies for book history.


Nora Benedict, Princeton U

Sheila Liming, U of North Dakota

Alison Fraser, U at Buffalo, State U of New York

Kevin O’Sullivan, Texas A&M U, College Station

Amy Chen, U of Iowa

Respondent & Presider

Erin Ann Smith, U of Texas, Dallas


Nora Benedict, “MercadoLibre and the Democratization of Books: A Critical Reading of New Material Affordances and Digital Book History”

The image of the “scholar-collector” is a constant in book history. In fact, many of the best bibliographers and book historians have unearthed crucial findings thanks to their ability to amass impressive collections of their own, from which they have gleaned novel insights. In our digital era, this reality is even more apparent, especially within the context of Latin America, whose rich history of book production and circulation in the twentieth century remains, for the most part, understudied. That said, the growth of online marketplaces, most notably MercadoLibre, which faces virtually no competitive pressure from similar sites such as eBay or Amazon, is revolutionizing the way scholars research and engage with their materials. From never-before-seen publishers’ catalogues and type specimen books to entire runs of rare paperback editions from the early part of the twentieth century, MercadoLibre provides access to print materials that, more often than not, fall outside the scope of most library catalogues – both national and international.

In this paper, I will provide an overview of my current digital project on Victoria Ocampo’s Editorial Sur, a twentieth-century Argentine publishing firm, and the ways in which my work is fundamentally shaped not only by the print materials that this firm produced, but also by their (digital) availability on MercadoLibre. In other words, at the heart of my project is a concern for curating collections – along with detailed metadata about their physical features – that exist in both a physical space and a digital, freely available one, and, in the process, for giving voice to often neglected literary traditions and marginalized global publishing histories. Inherent in this dualism of MercadoLibre and other online marketplaces is the question of what we consider an archive in the digital era of book history, and how we ethically determine best practices for approaching and using these resources.

Sheila Liming, “The Reprint as Review: NYRB Classics Editions and the Business of Canonical Renovation”

In his influential essay “The Shaping of a Canon: U.S. Fiction, 1960–1975” (1987), Richard Ohmann arrives at a stunning conclusion: nearly one fourth of all the books reviewed over a period of fifteen years by the New York Review of Books were published by the same firm, Random House. Ohmann then proceeds to show how the business of reviewing during this era shaped understandings of taste and “value” with regard to new works of literary fiction.

In the early 2000s, the New York Review of Books launched its “NYRB Classics” series, seeking to introduce reprint editions of “lost” literary classics to the contemporary literary marketplace. Where it had previously made its mark as an intellectual powerhouse through book reviewing, the NYRB now engaged in a process of re-reviewing, using increasingly cheap printing technology to renovate the previously sacred space of the English-language literary canon. In doing so, the NYRB Classics series also began to exert pressure on readers’ notions of the term “classic,” which, far from a Kermodian insistence on “perpetuity” and “transcendence,” came to be associated with neglect, disregard, or abandonment.

In this presentation, I survey the inroads that the NYRB Classics series has made into reshaping the western literary canon. I style that discussion as a kind of update to Ohmann’s findings, which were published more than thirty years ago. In particular, I focus on how the NYRB Classics series has sought to de-westernize the western canon by shining a light on forgotten portions of the world literary marketplace. This situation differs sharply from the one described by Ohmann in his 1987 article and, in my research, I use quantitative methods (including data visualization) to show how the NYRB is, today, actually revising a canon that it helped establish thirty years previously, during the first “wave” of what are now known as the “canon wars.”

Alison Fraser, “Homemade Books: Ephemeral Book Objects in Twentieth-Century Book History”

This paper argues that twentieth-century book historians should consider ephemeral, homemade objects like scrapbooks, clippings files, and photo albums: a feminist redetermination of what we consider to be valuable in the study of book history. More acutely than any other type of writing, scrapbooks and clippings call into question what we traditionally understand to be the labor of making and insist on refocusing our attention on process when we consider the product (or book). The “homemade” quality of the ephemeral objects analyzed in this paper positions them outside the traditional publication marketplace and its attendant critics, locating them inside private, domestic spaces of production. This removal from the literary marketplace allows their makers to create outside the bounds of a male-dominated publication community, making and circulating works around the dominant publishing economies and exclusive historical narratives as they claim taste-making roles (like those of editors and publishers) for themselves. My paper focuses on the homemade object making of twentieth-century female poets—women who were deeply familiar with publishing and eager to explore its alternatives. These poets make objects that have process specially coded into them, and I contend that this process of labor and language is tied up with the invisible labor of women. While the field of book history has embraced the idea of the “book-as-object” and has productively examined pre-twentieth-century ephemeral print publications, there is a need to examine these issues in light of twentieth-century concerns, as well as their relevance to digital books and their multimodal and user-centered platforms.

Kevin O’Sullivan, “The Bibliographical Press Anew: Leveraging 3D Technologies in Book History Pedagogy”

In the past decade, the allied technologies of 3D scanning, 3D printing, and 3D modelling have been integral to advances in fields as diverse as biomedical research, aerospace engineering, and zoology. While these tools have only recently gained a prominent foothold in the humanities, the results there have been no less exciting. Given its concern for material culture, the interdisciplinary field of book history stands to benefit especially from this new trend. Embracing the open-access ethos promoted by both the digital humanities and maker communities, the application of these technologies within our field promises to be a global democratizing force that will change the way future scholars research and teach book history. In this new wave of digitization, libraries and museums have begun to make robust 3D data of their collections materials widely available to scholars through open repositories. By extension, working facsimiles of once-expensive resources integral to the instruction of historical printing can now be 3D printed for a fraction of the cost. This paper will begin with a survey of these and similar efforts to extend research and instruction possibilities within book history through the application of 3D technologies. It will then consider the important ramifications that this new accessibility of 3D data will have for broadening the scope of who is able to participate in such research and instruction, and how they are able to do so.

Amy Hildreth Chen, “Playing around with book history: Codex Conquest and Mark”

Students learn more when they play. While the value of play is often emphasized only for those early in their education, play has a role in higher education as well. To teach book history across time and space, I developed two card games: Codex Conquest and Mark (the latter still under development).

Codex Conquest allows students to recognize the most important books of Western civilization by their nation, century, genre, and current monetary value. Along the way, students learn European history and the scenarios that influence the shape of institutional collections. Mark introduces students to the hallmarks of early modern visual culture by allowing them to play a variety of games with a single deck of cards made up of printers’ marks (devices). As open educational resources (OERs), both games can be downloaded for free from their respective websites and used as is, or changed to suit an instructor’s objectives. As supplemental curricula, both games can be played in a single class period.

These games represent a new direction in digital humanities. Digital humanities work on book history often considers the value and qualities of digital editions and facsimiles, or focuses attention on annotation and other approaches to scholarly editing. This talk, however, offers something new: it proposes that book history digital humanities expand to consider the possibilities offered by game design.


Radical Book History

SHARP Affiliate Organization Panel at MLA
Radical Book History: People, Archives, Methods
Thursday, 5 January 2017, 5:15–6:30 p.m., Franklin 3, Philadelphia Marriott

  • This roundtable will discuss the study of “radical” book trade figures and the use of “radical” methodologies and archives. Digital humanities will be an important aspect of this discussion; literary modernism and censorship in the twentieth century will be another common theme.
  • Amy Chen will look at the market for literary collections in the United States from 1944 forward, and its impact on the literary canon.
  • Ronan Crowley will talk about large-scale digitisation initiatives that shed light on the way James Joyce wrote Ulysses.
  • Hannah Field will discuss the issue of titles rejected from British deposit libraries and its impact on the ideal of the universal repository.
  • Laura Heffernan will look at a largely neglected figure, the editor John Rodker who collaborated with major modernists such as Ezra Pound and James Joyce.
  • Eric Loy will talk about the Henry Miller Literary Society and censorship in twentieth-century America.
  • Heidi Morse will look at American small presses that helped catalyze the spread of black feminist discourse and writing by radical women of color in the 1970s and 1980s.


A Quantitative Approach to the Canon: Literary Collection Acquisition Patterns

Amy H. Chen

A radical reconceptualization of book history requires us to think not only about how books are composed, published, and read, but also how writers’ papers, which document the creation of these books, circulate in their own market.

This paper will examine literary collection acquisition trends for the authors listed in Volume E of the Norton Anthology of American Literature using quantitative descriptive analysis with primarily nominal data. Results of this study include, but are not limited to, the demographics of writers with placed and unplaced collections; how often literary collections are given rather than sold; and what types of connections are most likely to result in an author selecting one academic library over another to hold his or her collection.

The research presented in this paper comprises three chapters in a forthcoming book to be titled Archival Bodies: The American Literary Collections Market since 1944.

‘Trieste-Zurich-Paris’: Literary Geography and Large-Scale Digitisation

Ronan Crowley

As an émigré Irishman living on the Continent during and immediately after the First World War, James Joyce wrote Ulysses (1922) in ‘Trieste-Zurich-Paris, 1914–1921’, as its final line famously proclaims. Criticism has suffered, however, from being too narrowly focused on the social networks to which this itinerary introduced the writer. Moreover, while generations of readers have noted the densely allusive nature of the novel, the role that Joyce’s migrations played in creating this multilayered, reiterative effect has been entirely overlooked. This paper focuses on the transforming print ecologies of war-torn Europe in order to trace the impact that relocation around the Continent had on the preeminent resource for Joyce’s writing: the printed material from which he derived reusable copy. Not only will such analysis sharpen our understanding of the compositional history of a modern masterpiece – revealing an even wider, more fundamental cosmopolitanism than previously suspected – but it will also reveal the relationships between the print culture of the early twentieth century and the mass digitisation of this material ongoing since the early 2000s.

No Such Book: Legal Deposit, Rejected Items, and the Ideal of the Universal Repository

Hannah Field

Arguments for the legal deposit of books—the process by which a select group of libraries receives all copyrighted publications gratis, and preserves them for posterity—are typically founded upon egalitarian principles. Indiscriminate conservation of absolutely everything is legal deposit’s chief recommendation as an archival practice. However, legal deposit is also marked both by debates around what should be included in these (elite) libraries, and by a relatively unexamined history of rejection. Plays, novels, almanacs, sheet music, digital media: these materials, among others, have challenged not just the practical implementation of legal deposit, but also its catholic ideals. This paper will use titles rejected from British deposit libraries as the basis for a radical methodology for examining ephemerality, canonicity, national print cultures, and the universal repository. Examining items rejected from legal deposit brings currently high-status items (such as novels and plays) into dialogue with items that remain neglected to this day (such as almanacs and sporting manuals); it also illuminates the negative formulation of concepts of print’s value, which comprise exclusion—the ‘no-such-books’ that will not be preserved—as well as positive decisions. At the paper’s centre are copyright debates in 1818, when publishers and authors complained that deposit libraries, including those at Oxford and Cambridge, rejected too many books. Library representatives were then forced to justify their acquisition practices in the House of Commons. These parliamentary records provide an unexpected location for disavowals and defences of the period’s key print forms, including the novel, as well as for meditations on the universal repository in theory and in practice.

John Rodker and the Failures of Print

Laura Heffernan

This paper opens with an overview of the humbler histories of print recently offered by scholars such as Leah Price, Lara Cohen, and Trish Loughran. Arguing that we have over-estimated print’s power, these critics highlight instead the failure of books to furnish individual interiority, foster imagined communities, or even be read at all.  What, if anything, could be radical about these new accounts of print’s inefficaciousness?  To answer this question, this paper turns to the 1920s to consider a constellation of projects undertaken by poet, publisher, and editor John Rodker. Though Rodker collaborated with major modernists such as Pound and Joyce, he has largely been left out of literary histories of that movement — ostensibly because of the frankly sexual content of his writing and of his subscription-based Casanova Press, which produced luxury editions of historical erotica.  We might thus associate Rodker with the promises of radical print and book shop counter-culture, yet I argue that Rodker himself grappled through the 1920s with his own lived sense of print’s failures. Indeed, Rodker envisioned a future without print and a public undivided by literacy; he ceased writing and publishing himself in the early 1930s.  Drawing on his publisher’s papers, editorial correspondence, and the dream journals he kept during his psychoanalysis, I reconstitute Rodker’s experience of the limits of print and suggest how his work indicates the radical promise of book history’s own turn from authorial geniuses to uncelebrated publishers/editors/readers and, most recently, to non-readers.

The Henry Miller Literary Society: Subverting Censorship in 20th Century America

Eric C. Loy

Published in Paris by Obelisk Press in 1934, and an instant classic in the European world that produced it, Henry Miller’s Tropic of Cancer was “immediately famous and immediately banned in all English-speaking countries” (Shapiro ix). Three decades later, Grove Press published an American edition, which led to dozens of obscenity lawsuits in more than twenty states—a legal quagmire not settled until 1964, when a U.S. Supreme Court decision vindicated Tropic as a work of literature.

Through an archival excavation of original correspondence and official publications currently stored at the University of Minnesota, this presentation recounts the genesis and development of the little-known Henry Miller Literary Society (HMLS) as it represents and participates in the cultural shifts surrounding Miller’s and Tropic’s tumultuous history of reception in the United States. The society, founded by Minneapolis printer Eddie Schwartz in 1958, comprised a grassroots effort for the publication and promotion of Miller in his own country, in his own time. Examination of letters between Schwartz and Miller, as well as of the society’s newsletters and other publications, reveals a highly motivated and coordinated campaign for the cultural and academic acceptance of Miller’s work.

Accordingly, primary documents will be presented to illustrate the society’s historical narrative of subverting literary censorship and their support for one of American literature’s most radical figures. This account of the HMLS thus engages radical book history twofold: by recovering lost or suppressed narratives of censored literature and through the proposed model of an archive of documents to tell such a story. Invited roundtable discussion will focus on the continued importance of material archives and on strategies for editing primary documents in a political context.

From Shameless Hussy to Kitchen Table: Women in Print History

Heidi Morse

The first editions of Pat Parker’s and Ntozake Shange’s first books, Child of Myself (1971) and For colored girls who have considered suicide/ when the rainbow is enuf (1975), share a surprising intimacy: both were run off the same AB Dick 360 offset press in poet-publisher Alta Gerrey’s garage. Alta’s Shameless Hussy Press, founded in 1969, produced bold chapbooks with a philosophy of minimal editing and maximum exposure. A decade later, Barbara Smith co-founded Kitchen Table: Women of Color Press, which published key feminist texts such as Home Girls (1983) and This Bridge Called My Back (1983, 2nd ed.). Both presses helped catalyze the spread of black feminist discourse and writing by radical women of color, but they also depended on DIY methods and a very small, mostly unpaid labor force for production and distribution. Alta carried boxes to local Bay Area bookstores, while Kitchen Table used longtime volunteer Lucretia Diggs’s home address for years because she was such an integral part of daily operations. Feminist print circuits in the 1970s and ’80s were bound to the daily lives of the women who made them work, and scholarship on radical print history offers a unique opportunity to examine this interconnectivity. Using the daily operations of these two presses as case studies—with archival evidence from UCSC’s McHenry Library (Shameless Hussy) and the Lesbian Herstory Archives (Kitchen Table)—this paper theorizes radical women’s print history as a history of radical everyday actions by women who believed in the power of print.

Into the Digital Future

SHARP Affiliate Organization Panel at MLA
“Into the Digital Future: Amazon, Apple, and Google Make Book History”
Vancouver Convention Center West 121
Thursday, 8 January 2015

The Book Trade from the Perspective of Its Businesses: Recent Developments
Daniel Raff, Univ. of Pennsylvania

This talk will survey the evolution of channels of distribution for long-form reading matter, and the relationship of channel actors to their customers, from the mid-1990s to the present. It will begin with the growth of “superstore” bookstore chains in the 1990s, probing the consequences for mall-based chains and independent bookstores, as well as the internal impediments to profitability and further growth that the chains developed as the 1990s wore on. The possible and actual histories of online bookselling will then be sketched, from their early-1990s roots through the near catastrophe of the early post-millennium years to the present. In the current state of play, the number of independents is much diminished, the principal mall chains have been absorbed by larger entities, Borders (with its captive mall chain) has gone bankrupt, Barnes & Noble is troubled, and Amazon’s book sales and market share are flourishing, with many of the “books” it sells being electronic files readable only on Amazon-sold and -controlled devices. The legacy publishers are very worried, and with good reason, as the recent and ongoing struggles between Amazon and selected major publishers this calendar year have shown. Amazon’s resources and competitive strategy (how these have developed, how they currently situate the firm, and the opportunities they have created for Amazon and other collective actors going forward) will be characterized in a way that situates this discussion relative to the papers by Laquintano and Sickmann that follow.

Amazon et al.: Self-Publishing and the New Intermediaries
Tim Laquintano, Assistant Professor of English, Lafayette College

This presentation will begin by profiling the meteoric rise of self-publishing and its growing role in the contemporary publishing economy (recent estimates suggest that 30% of Amazon’s best-selling ebooks are self-published). Then, working from the premise that digital giants (e.g., Amazon) have become key intermediaries in the publishing chain, it will attempt to theorize, in a grounded way, how these “new intermediaries” shape the work of self-publishing ebook authors. The presentation draws on ethnographic interview data from a six-year study of seventy ebook authors to show how digital distribution systems impinge on the relationship between writers and readers. It pays particular attention to how the affordances of such systems (publishing policies, payment systems, metadata) shape the production of writers and their attempts to foster the circulation of their texts. It ultimately aims to advance a burgeoning discussion about how writers negotiate new models and possibilities for publishing.

Co-Creating Fictional Worlds Online: Hugh Howey and Kindle Publishing
Carrie Sickmann Han, Indiana University

Hugh Howey’s bestselling science fiction series, The Silo Saga, is attracting attention in the book industry for its unique online publishing history. What began as a short story published through Amazon’s Kindle Direct Publishing platform (KDP) quickly grew to a three-novel series (all first published using KDP) when an enthusiastic readership took advantage of online forums to demand more. Despite an unparalleled deal with Simon & Schuster that allows Howey to retain electronic rights to the books after they appear in print, Howey adamantly rejects any claim to exclusive rights to the fictional characters, events, and worlds he creates. He actively denounces Digital Rights Management (DRM) and encourages readers to use his fictional worlds as springboards for their commercial publications. Howey’s view of fiction as “a potentially collaborative affair” is gaining popularity with digital authors and readers, and major publishers like Amazon are responding by developing platforms that encourage readers to become co-creators of their favorite stories. By tracking Howey’s innovative use of Amazon’s newest publishing platforms, this paper will argue that we’re progressing towards a digital future that treats fiction as co-created, interactive, expanding worlds that extend beyond a single book or author.


Milton and Book History

SHARP Affiliate Organization Panel at MLA
Vancouver Convention Center, West 204
Friday, 9 January 2015

This collaborative session, proposed by the Society for the History of Authorship, Reading and Publishing and the Milton Society of America (both MLA allied organizations) highlights the fertile intersection of book history and Milton scholarship and shows how the material forms of Milton’s texts are inseparable from the meanings produced by their readers and consumers. The meaning of Milton, these panelists demonstrate, is produced not simply out of technical industry, but by social forms, ideologies, political and intellectual dispositions, as well as the creative energies of writers, translators, and book producers. The three panelists identify the various kinds of agency involved in these transactions, building on recent new understandings of the histories of reading, authorship and publishing that have challenged the view of Milton as a lonely writer. If Milton is a social writer—one of the earliest to see the potential of the printing press to expand cultural and political inclusiveness—most recent work on Milton and the history of the book has focused on ideas of authorship and on the role of the author himself. This panel highlights how the media and circumstances of dissemination constitute the meaning of Milton’s works; it thus contributes to an understanding of authorship and cultural bibliography, and it also adds original historical findings to a sociological account of Milton’s early networks.

This panel brings together three Milton scholars who apply the tools of book history and bibliography to investigate and elucidate Milton’s life, career, and works. The first paper, Blaine Greteman’s ‘Milton’s first book and the making of a print author,’ explores the first work Milton had printed, the Epitaphium Damonis, an elegy that has rarely been discussed in a print context. Yet, as Greteman argues, the poem carefully affirms, reconstitutes, and expands the social, poetic network that Milton had established during his schooling in England and his travels abroad during the 1630s. Greteman, drawing on both archival work and his ongoing digital project, maps the circulation and production of both print and manuscript texts to illuminate the ways that the Epitaphium inaugurates Milton’s investment in the book, in print authorship, and in the poet’s robust social involvement with his world.

Nicholas von Maltzahn’s paper, ‘Who Printed Areopagitica? The Press and Milton’s Paper Work’, will announce a major discovery, one based on scholarship von Maltzahn is undertaking for his volume in the Oxford University Press Complete Works of John Milton (forthcoming). Although Areopagitica has enjoyed great fame as Milton’s defense of the press against pre-publication licensing, its printer has until now never been identified. Von Maltzahn will identify the printer and, on that basis, revisit Milton’s conception of the press’s work in the English Revolution, with special reference to the kinds of labour and the literary genres involved in producing such an illicit publication within underground print networks. Both printer and author, it will be shown, shared a pattern of commitments that were at once literary and political.

While these first two papers emphasise the importance of cultural bibliographic context in Milton’s own day, the third paper, Angelica Duran’s ‘Milton’s Areopagitica: A Speech to the World,’ chronicles the translations of Areopagitica into various languages and countries in recent or contemporary settings. After giving a brief history of Areopagitica’s translation (or prohibition) in twenty languages, including French, Hungarian, Japanese, and Polish, her paper then focuses on two recent cases, Spanish and Chinese, chosen because the issue of censorship in each country produced complex responses to Milton’s powerful statement against pre-publication licensing. Duran explores the different cultural impacts of Areopagitica’s first vernacular publications in Spain (1941) and China (1991), highlighting the ways Milton’s writing engages with topical debates over censorship. This paper brings the study of book history up to the present. The 370th anniversary of Milton’s anti-censorship pamphlet Areopagitica reminds us that he was not only a poet but also an activist, deeply concerned about how ideas, in the form of printed texts, circulate in society.

Stephen Dobranski, a leading researcher in the field of Milton, authorship, and the book trade, will provide a response to the panel, putting the papers’ wide chronological sweep (from 1630s England to 1990s China) in context for the study of Milton and of the history of the book.

Greg Barnhisel (co-organizer) is Associate Professor and Chair in the Department of English at Duquesne University. He is the author of James Laughlin, New Directions, and the Remaking of Ezra Pound (Massachusetts, 2005) and the forthcoming Cold War Modernists: Art, Literature, and American Cultural Diplomacy (Columbia, 2014) and is one of the editors of the journal Book History.

Sharon Achinstein (co-organizer, presider) is Professor of Renaissance Literature, University of Oxford; she will take up her position as Sir William Osler Professor of English Literature at Johns Hopkins University in July 2014. Her books have explored the histories of political communication and literature in the early modern period, and include Milton and the Revolutionary Reader (Princeton, 1994), Literature and Dissent in Milton’s England (Cambridge, 2003), and two edited collections, Literature and Toleration (Oxford, 2007), and Gender, Literature and the English Revolution (Cass, 1994), and she is currently on the Executive Committee of the Milton Society of America.

Blaine Greteman is Assistant Professor of English, University of Iowa, and is author of The Poetics and Politics of Youth in Milton’s England (Cambridge, 2013), and has published articles on Milton, Jonson, and Donne, as well as long-form political journalism. He is currently working on a digital project, “Shakeosphere: The Early Modern Social Network,” for which he earned seed funding in 2013.

Nicholas von Maltzahn, Professor of English at the University of Ottawa, is editing Areopagitica as part of his volume of Milton’s tracts on religious liberty for the Oxford University Press Complete Works of John Milton (vol. 4, forthcoming). He has published numerous studies, especially of Milton and Marvell, including a book-length Andrew Marvell Chronology (Palgrave, 2005); an edition of Marvell’s Account of the Growth of Popery and Arbitrary Government (in The Prose Works of Andrew Marvell, Yale UP, 2003); and a monograph on Milton’s History of Britain (Oxford UP, 1991).

Angelica Duran is Associate Professor in English, Comparative Literature, and Religious Studies at Purdue University; author of The Age of Milton and the Scientific Revolution (Pittsburgh, 2007); and editor of A Concise Companion to Milton (Blackwell, 2007). She is currently coediting Milton in Translation (under consideration), has published articles on Milton’s reception in Spain, and has coedited a volume in comparative cultural studies, Mo Yan in Context: Nobel Laureate and Global Storyteller (forthcoming, Purdue UP, 2014).

Stephen Dobranski is Professor in the Department of English at Georgia State University and has made significant book-historical contributions to Milton studies, including Readers and Authorship in Early Modern England (Cambridge, 2005; pbk, 2009) and Milton, Authorship, and the Book Trade (Cambridge, 1999; pbk, 2009). Author of The Cambridge Introduction to John Milton (Cambridge, 2012), Dobranski has also edited Milton in Context (Cambridge, 2010) and Milton and Heresy (Cambridge, 1998; pbk, 2008).
