This post was originally published at my personal blog, Bloviate.
Last Wednesday Matt Gold and Charlie Edwards invited me and a few of my favorite CUNYs to come speak to the CUNY Digital Humanities Initiative, a new group at the University “aimed at building connections and community among those at CUNY who are applying digital technologies to scholarship and pedagogy in the humanities.” Matt and Charlie were especially interested in bringing CUNY educational technologists to this meeting because the relationship between edtech and the digital humanities is something that’s been assumed more than theorized: we all focus on the intersection of technology and academic work in the humanities, ergo we must be doing similar and somewhat simpatico things.
With a field as nebulous in its boundaries and definitions as the digital humanities, this stance hasn’t been particularly problematic. There has, however, been significant energy within the digital humanities over the past year devoted to self-definition. At the same time, the loose, distributed community of educational technologists working with open source publishing platforms, of which I consider myself a part, has congealed around a certain set of ideas. I intended my contributions to the CUNY DHI to draw some points of difference between these entwined trajectories, to look upon the digital humanities through the lens of my recent experience becoming an educational technologist after completing a graduate degree in history, and ultimately to raise some questions about the tensions I see between the two realms of academic life.
In advance of the visit, we were asked to circulate some readings, and I chose Mike Neary and Joss Winn’s “The Student as Producer.” This piece contextualizes the work that I and several of my colleagues have been engaged in over these past few years. Our work as educational technologists has emerged to meet a particularly nefarious challenge that Neary and Winn powerfully delineate: over the past two generations, the function of the university has been increasingly shaped in response to the forces of capital. “Since the 1980s, universities, in response to government pressure, have become more business-like and enterprising to take advantage of the ‘opportunities’ presented by the so-called global ‘knowledge economy’ and ‘information society.’” At the risk of overdrawing the picture somewhat, we see the impact of such pressures in pretty much every nook and cranny of the university: in how resources are sought and allocated, in the corporatization and professionalization of athletics, in the anxiety over assessment and accreditation, in the structure and vicissitudes of the academic labor market, in the predatory student loan and credit card industry and, not least of all, in the classroom, where structures of instruction commonly lead to students being treated as vessels into which information should be dumped en route to the job market.
Blogs@Baruch and its sister projects emerged in direct response to these conditions. Our original focus was on nurturing student-centered learning by merging WAC and WID principles with the possibilities opened up by online publishing, on making more visible the pedagogy (both successful and not) at work in our classrooms, and on supporting an alternative to the proprietary course management system that still predominates across CUNY. Blackboard is itself an embodiment of the university culture that Neary and Winn rightly find so troubling: students cycle through a system that structurally, aesthetically and rhetorically reinforces the notions that education is consumption, the faculty member is a content provider, the classroom is hierarchical, and learning is closed. Less and less though do we have to convince listeners that open source publishing platforms and the many flowers they’ve allowed to bloom can create exciting possibilities in and beyond the classroom; we can show them link after model after link after model after link.
And yet our argument has quickly expanded beyond the classroom to engage broader questions about curricula, the social life of the University, the very way that our community members think about their experiences. Our engagement is a humanistic one in that it insistently constructs the university first and foremost as a site of inquiry and exploration, resists and complicates the concepts of deliverables and education as consumption, challenges staid structures of power, and seeks to constructively question motives and goals at every opportunity. Technology and the open web have empowered us in this endeavor, leveling the playing field in ways that give those who might imagine other trajectories within the university the means to counteract power.
I could say much more about the work we’ve been doing, where it’s succeeded, where it’s failed, and how it’s been a struggle. But the point here has been to situate our work, to historicize it in a way that brings to the fore its politics. This is something that I think the progressive edtech movement has done quite clearly, but that the digital humanities have not.
In many ways, the digital humanities is not really new. Or, that is to say, the methods and questions and processes that constitute its core are not new. Just drawing upon my own disciplinary (and professional) past, the folks at the American Social History Project have been exploring the implications of new technologies for scholarship and pedagogy for nearly thirty years, challenging orthodoxies and valorizing collaboration and innovative approaches to engaging with the past since the Kaypro II. The Center for History and New Media was founded in 1994, and together these two organizations built the first large scale efforts to digitally reimagine the past in the classroom and beyond. Randy Bass’s work out of Georgetown — which I first encountered as an undergraduate participant in the “Crossroads Project” at the University of Michigan in the mid-90s — has done much to promote the use of digital tools to remake the classroom and curricula. Additional examples in “humanities computing” are many.
What is new about the digital humanities, though, is the legitimacy, funding, and visibility that it’s found over the past few years, and those are the components that have sparked recent efforts to set some boundaries and define the field. Frankly, this process has sometimes bordered on the absurd. The recurrent presence of phrases like “big tent,” “expansive,” and “broadly conceived” gives speakers a rhetorical tool set for drawing just about any academic work done with technology into the field. It gives graduate students who use technology in their research a language for demarcating their work from that of those who do not. This slipperiness makes formulating a critique a significant challenge, since the digital humanities resists being reduced to a single or even a handful of things. In trying to write this I’ve had a difficult time boiling my critique down to an unhedged essence. But, here goes.
The (un)structure of the digital humanities has led to a careerism and opportunism that, to the outsider, often obfuscates the genuinely pathbreaking work that’s happening around the field. It’s here where I see the biggest point of difference between educational technology and the digital humanities. Edtech is necessarily implicated in constructing the university of the future, and one of the many reasons that battle is so important is that its outcome will in fact go a long way towards determining the future of the humanities. While there is significant political content within the digital humanities — the valuing of openness, the emphasis on sharing, the location within technology of particular tools and methods for empowerment — one gets the sense that ideology is not the main thing. In other disciplines (history and educational technology being the two I’m most familiar with) political debates abound, often propelling ideas forward. In the digital humanities you tend to see much more agreement than disagreement. While it’s well and good to be agreeable, and I far prefer people who are, we are in high-stakes times. The humanities have been and continue to be in crisis. Budgets are burning, departments are being axed, and in many places the very value of a humanistic education is not only being questioned, but boldly denied.
And yet, a tone predominates in the discourse around the digital humanities that often seems to sidestep this crisis, or miss it altogether. Part of this is no doubt attributable to the fact that the digital humanities has become so dependent upon Twitter and is thus subject to the distorting echo of the hive mind. Part of it is also attributable to the new sense of community and connectedness within the field, which has also spurred a significant amount of navel-gazing and those efforts to self-define. I admittedly suffer from enthusiasthma, but the “I’m okay, you’re okay” “RT congrats!” cliquishness that flows across my screen and predominates at DH gatherings seems to me to be a bit misaligned with the current trajectory of the humanities in higher education. DH jobs, funding, and departments are becoming more widely available while the broader humanistic project — to which universities are central — crumbles around us. Are new tenure track positions, attempts at building a canon and establishing authority, and a dozen new conferences representative of progress, or are they reentrenching and reinscribing power along traditional paths? (Yes, I realize the answer can be “both.”) And why do digital humanists seem to celebrate scholarship much more deeply and publicly than teaching and learning? These questions are at the core of my discomfort with aligning my work with the digital humanities, as much as I’ve learned and benefited from scholars at its center.
Some might ask, “well, what about #alt-ac?” I appreciate the extent to which that phrase articulates, illuminates and validates the variety of labor paths and modes that make the university function and evolve (including what I do). Yet I can’t help but feel that something might be lost by, as Jim Groom has said, “naming and reifying my alterity.” Adapting for myself the pressure to publish, travel to conferences, keep up with the canon, to constantly produce and present new research — all of the things that seem necessary to establish one’s self within the digital humanities, even as an “alt-ac” person — doesn’t really seem “alt” at all. It seems exactly like what I expected from a career in academia.
I realize this argument is deeply personal, perspectival and located mostly within my own struggles to navigate professional terrain. I’m not trying to shit on anyone’s work. Some of my best friends are digital humanists, I swear. But I know that I’m not the only person to feel some of the things I’ve written above. At the end of my brief, wholly unpolished presentation to the CUNY DHI last week, @mkgold tweeted “@lwaltzer argues for a more muscular, progressive version of the Digital Humanities that questions/critiques power.” I initially wasn’t comfortable with that conclusion being drawn from what I had said because I don’t feel myself enough of a DH insider to make any arguments for what its future should hold. And yet upon more reflection I do feel that nurturing that ethos is and must be central to the humanities. It’s simply too important to be absent from or even unclear in any future vision of the university.
I guess that, thanks to Matt and Charlie’s invite and the struggle to write this post that ensued, I’ve learned that I’m interested in the digital humanities only to the extent that it helps me use technology to do the work as a humanist that I’d try to do even if we had no computers. So does that mean I’m in, or out?