Category Archives: Fall 2013

Posts from Fall 2013

The Reverse Midterm

Source: “Reverse If,” cc-licensed flickr photo by Sharon Hinchliffe

Given our discussion of DH pedagogy last class, I wanted to share with you a post published today by our colleague Joe Ugoretz, Associate Dean
of Teaching, Learning, and Technology at Macaulay Honors College. In “Reverse Midterm,” Joe describes a recent experiment in the classroom with his students:

I started the class by telling them that I realized that we did not have a midterm scheduled, but we still had to have one, so today was the midterm. [. . . .] but it would be my midterm. They write the questions, I have to answer. I told them they could grade me, too. (this led to some moments of real joy).

Head on over and read the whole post (published, I will note, on the CUNY Academic Commons).

Is this DH pedagogy? Digital pedagogy? Paper-and-pencil pedagogy? I’d posit that such labels matter less than the fact that it is inventive and thought-provoking pedagogy, which is what good teaching should be most concerned with.

What Does Digital Humanities Mean for Pedagogy?

I was so happy to read Stephen Brier’s essay in Debates in the Digital Humanities, which echoed many of my thoughts:

…this recent rush toward the technological new has tended to focus too narrowly, in my judgment, on the academic research and publication aspects of the digital humanities, in the process reinforcing disciplinary “silos” and traditional academic issues while also minimizing and often obscuring the larger implications of DH for how we teach in universities and colleges and how we prepare the next generation of graduate students for careers inside and outside of the academy.

For me, yesterday’s class discussion really got to the heart of my Digital Humanities questions. Digital tools like text mining and data visualization are nice for higher-level research, but what about undergrads preparing to go out into the real world? Yes, these things are used by people outside of the academy (see here and here), but shouldn’t we be introducing students to a broader range of digital tools earlier on? When we incorporate technology in pedagogy, is it “Digital Humanities,” or just teaching and the internet? Does it matter?

As a Literature major, I was taught minimal digital skills in my undergrad courses. I took one required programming course that didn’t really stick, but I had a basic understanding of the internet and computers because I grew up in the 90s. While the degree helped, of course, I was hired into my first desk job mainly because of the practical skills I picked up on my own. When I started working at a magazine, I picked up things that anyone in media needs—a basic understanding of HTML, how to work with content management systems, and how to be a project manager. My school didn’t offer courses in digital media, though I’m sure students with the foresight to know what kind of job they want (that is to say, not most 18-year-olds) would be able to pick up similar skills in such classes at their own universities.

Many of my undergraduate teachers encouraged us to incorporate digital tools into our projects, I guess assuming that most kids these days were adept at programming and building websites. I remember only one student going in this direction in a Creative Writing seminar – a friend of mine who had learned to code on her own because she enjoyed it. On presentation day, instead of reading from a novella, she presented a video game, moving the characters around the screen to act out her story. Jaws dropped. How could an English major use computers like that?

Anyway, that Literature student works for Google now.

As Stephen Ramsay says in his essay “Programming with Humanists,” “…if an English or a History student finds his or her way into a class on programming, it is not because of some perceived continuity between the study of Shakespeare or the French Revolution and the study of for-loops and conditionals. Most students see programming—and not without justice—as a mostly practical subject” (228).
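
For anyone who hasn’t encountered them, the “for-loops and conditionals” Ramsay mentions are about as basic as programming constructs get. Here is a minimal sketch of my own in C++ (the word counts are invented for illustration; nothing in it comes from Ramsay’s essay):

```cpp
// A toy illustration of a for-loop and a conditional: walk a (made-up)
// list of word frequencies and flag the words that appear often.
#include <iostream>
#include <string>
#include <utility>
#include <vector>

int main() {
    // Invented word/frequency pairs, as if pulled from a novel.
    std::vector<std::pair<std::string, int>> counts = {
        {"whale", 1119}, {"sea", 435}, {"typewriter", 2}};

    // The for-loop visits each entry; the conditional decides what to say.
    for (const auto& entry : counts) {
        if (entry.second > 100) {
            std::cout << entry.first << " appears often: " << entry.second << "\n";
        } else {
            std::cout << entry.first << " is rare: " << entry.second << "\n";
        }
    }
    return 0;
}
```

Even a toy like this makes Ramsay’s point: the code does something immediately and visibly practical, with no obvious continuity to Shakespeare or the French Revolution.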

It is to an undergraduate’s advantage, especially if he or she will not pursue graduate school, to have an active understanding of technology and its practical uses (even if they aren’t working for Google). I had a rockstar intern at my marketing agency who knew some HTML and could help me with e-mail newsletters, could use Adobe Illustrator and InDesign to help me put together marketing brochures, and knew how to use Excel to make charts and do calculations (thankfully for me, because Zzzzz…). He will have a leg up on other people his age applying to the same jobs after graduation. I usually assume most humanities students learn practical job-related skills on their own, in an internship, or when they get a Master’s degree. And while internships are a great place to learn on-the-job skills, many students in smaller communities don’t have that opportunity.

While undergrads in the humanities are learning excellent skills, such as how to present an effective argument, they aren’t getting enough practical skills from humanities teachers to make them competitive in the current job market. But outside of programming, business, science, or digital media, how do you do this for students with broader interests who are unsure of exactly what they want to pursue for work? To further complicate things, given how quickly technology changes, how can you ask teachers to keep up?

I think the whole area of pedagogy is where it becomes most important to define what we’re talking about when we talk about “DH.” Shouldn’t we be teaching students to use technology effectively in order for them to better interact with the modern world? The Looking for Whitman project is a great example of how you can combine practical skills (collaboration, writing for an audience, using blogging software) with academic ones (thinking critically about texts, etc.). While writing an effective essay is important, it isn’t everything. This recent Slate article brings this point home.

For me, DH has wider implications for the university system because the people who are involved seem to be the most open to new ideas. Without trying to seem too idealistic, shouldn’t we be harnessing this power somehow to change the system and the way students learn, rather than just using it in our own research?

As Brier says,

CUNY’s growing focus over the past two decades on the scholarship of teaching and learning has by no means been limited to the digital humanities, narrowly defined. If we are willing to broaden our definition of digital humanities beyond academic research and related issues of academic publication, peer review, and tenure and promotion to encompass critical questions about ways to improve teaching and learning, then CUNY’s various digital pedagogy projects and strategies offer an alternative pathway to broaden the impact of the digital humanities movement and make it more relevant to the ongoing and increasingly beleaguered educational mission of contemporary colleges and universities.

Educating the next generation of informed citizens ultimately falls on the shoulders of teachers. Now that technology is a part of that world, it should be a part of teaching as well. Because the impacts of technology are forcing things to move faster than, say, the printing press did, I don’t see academia catching on quickly enough on its own. But it’s also important to note that everyone is struggling to keep up, not just the academy.

Further Thoughts on Digital Pedagogy — How We Think

I found yesterday’s class discussion extremely productive. Particularly striking was the question posed regarding learning outcomes—i.e., if one is teaching writing, what should one expect students to actually gain and know by the end of the term?

As Mark Sample notes, the undergraduate essay is somewhat superfluous. Therefore, I find that augmenting critical thinking abilities, broadly speaking, is the most effective learning outcome, particularly for undergrads whose paths are often uncertain. However, this raises a somewhat abstract question as to how we think, which may be where DH pedagogy comes into play. For instance, there may have once been a moment in history when the most effective means for producing critical thinking involved reading and writing, but perhaps that is no longer the case. In my extremely limited teaching experience I have found that writing skills seem to be declining while technological skills are rapidly increasing, and that this is not an attribute of shifting pedagogical strategies but of mass culture in general, and perhaps even of radically different psychological modalities. Digital pedagogy seems to support this shift, whereas traditional pedagogy may be working against it.

N. Katherine Hayles’s most recent book, How We Think: Digital Media and Contemporary Technogenesis, highlights the distinction between hyper-reading and close-reading, and she calls for an amalgamation of the two. In other words, we should embrace the way we think and process information as a digital culture, but the ability to close-read is still valuable. This, I find, is the best approach to pedagogy in the digital age.

Admittedly, I am simply outlining my thoughts following yesterday’s discussion, and this is probably not an adequate account of what I feel is an important discussion in DH and certainly beyond. Do people, particularly those with more teaching experience, have any thoughts?

17th Century London

This post on Londonist, about a group of students from De Montfort University creating a “fly-through” of 17th-century London, seemed very applicable to our class readings for today:

A group of students at De Montfort University created this fly-through of 17th century London…The model focuses on the area around Pudding Lane and the bakery of Thomas Farriner, where the Great Fire of 1666 started.

The students used maps from the British Library, tavern signs, and building details from Samuel Pepys’s diary to capture life on 17th-century London streets. The class blog documents their work.

Sustained disruption

William Turkel’s presentation and workshop last week opened with the notion that those who engage in physical computing have the opportunity to “build objects that convey a humanistic argument.” This reminded me that DH scholarship isn’t constrained to data and digitization. While access to digital information and artifacts plays a huge role in the genesis and momentum of the digital humanities, working with data can simply be seen as working with knowledge in the most popular medium of the day. The systems we work within have multiple entry points, and many possible layers to manipulate. Beyond software (and the industry and implications of big data), physical computing and fabrication offer us an alternative way to formulate questions about interfaces, manufacturing, and the politics of innovation.

When working with computers and digital tools, we always confront not only the black box of processes that we don’t fully understand, but also the scholar’s entanglements with the prescriptions and rules of consumer technology. A physical computing project, though, works on a more fundamental level of abstraction. While, as Turkel pointed out, there is always a proprietary (non-transparent) layer involved, a physical computing project does allow its maker to experiment with and change a different set of parameters and functionalities than software allows. It’s important that we have permission and the resources to take a hands-on approach to computing, because it can disrupt and deepen our relationship to the technologies that we’re ultimately accountable for when DH practice becomes critique.
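
To make that concrete for readers who haven’t tried physical computing: the entry-level experiment is usually a microcontroller sketch that reads a sensor and changes some physical output in response. Here is a generic Arduino-style sketch of my own devising; the pins and threshold are arbitrary assumptions, not anything from Turkel’s workshop:

```cpp
// A minimal, hypothetical physical computing sketch (Arduino-flavored C++):
// read a light sensor and turn an LED on when it gets dark.
const int sensorPin = A0;   // photoresistor voltage divider on analog pin 0
const int ledPin = 13;      // built-in LED on most Arduino boards
const int threshold = 400;  // "dark" cutoff on the 0-1023 analog scale

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int light = analogRead(sensorPin);  // 0 (dark) to 1023 (bright)
  digitalWrite(ledPin, light < threshold ? HIGH : LOW);
  delay(100);  // check ten times a second
}
```

Notably, the parameters you end up tinkering with here are only partly in the code; they are also in the circuit, the sensor, and whatever enclosure you build, which is exactly the layer that software-only work never lets you touch.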

But I got the feeling that Turkel wasn’t overly concerned with that kind of broad or absolute speculation. I’m interested in the fact that Turkel’s lecture and workshop didn’t necessarily move in the direction of solving what it means to work with hardware, sensors, or fabrication. His talk sidestepped heavy-handed theoretical claims and predictive expansions of his opening thought, and instead moved into a discussion of his students’ individual projects. I’m not sure his lecture outline was a statement in itself, but it did seem that he refrained from making explicit claims about the need for or purpose of physical computing in answering DH objectives or critiques. Turkel seemed to be saying that while physical computing—as a medium for play and exploration—can represent ideas and embody cultural critique, the future of the humanities does not depend on our mastery or reinvention of microchips. Even though we are bound to make objects that convey arguments, Turkel’s pitch for the essentialness of making didn’t seem wholly contingent on the scientific or theoretical stakes of our approach. Perhaps “making” outside one’s comfort zone is important in itself, and represents a commitment to the interdisciplinarity that we have presumably already embraced.

I’m interested in the simplicity and purity in the invitation to play and fail, and wonder about this as a sustainable framework for understanding new tools and asking new questions. It’s interesting that Turkel made a few references to his love of kindergarten, and the values we lose after we leave that early classroom environment—a place of beginnings, an environment that’s ideal for experimentation.

In a sense, humanities scholars who decide to delve into hardware hacking, software, and interface design are also engaged in a kind of beginning or frontier. This last week, I’ve found myself asking questions about the relationship of beginnings to experimentation and play, and wonder how Turkel’s hands-on imperative will evolve as the contexts, spaces, and available expertise for making technology grow and change.

Making and creating and not worrying about failure

William Turkel’s lecture “The Hands-on Imperative,” his subsequent talk to our DHpraxis class, and his workshop had me thinking about a number of different things: the issue of space, the issue of tenure portfolios that contain fabrication and physical experimentation projects, and how failure is something to embrace rather than fear.

His discussion of space and the need for a place to make things, to play and create, resonated with me. In “A Few Arguments for Humanistic Fabrication,” a post on his blog “Digital History Hacks 2008,” Turkel said:

“The limitations of our physical spaces can be more difficult to circumvent. Most of the teaching and research environments available to humanists at my university are designed to support solitary or small-group office work. These spaces are almost comically unsuitable for the kinds of things I try to do with my students: soldering, mold-making and casting, building and lighting physical exhibits, programming in groups, creating displays or signage.”

One really has to be aware of how and why a space is being used. As a librarian I’ve always been aware of, and concerned with, the physical space and layout of a library and how the space affects people’s use of it. If a space is not inviting, or if it doesn’t match what people are using it for, it can be a real hindrance, stifling creation and education. The fact that humanities spaces in academia are not set up for play and the creation of physical objects does not surprise me at all. We need to be able to break out of the confines we find ourselves in, but in academic or corporate spaces we are often not allowed to. Trying to convince administrations to permit a space in an academic department where you can vent fumes, use power tools, and so on is not going to be easy; we aren’t even allowed to pick a different wall color to paint our student lounge. If we cannot even personalize the color of the walls of our own student lounge, how can we expect to create a space with “high ceilings, natural light, plenty of ventilation, cement flooring, workbenches on casters, locking cabinets, big blank walls where you can hang things. No carpeting, no beige cubicles, no coffee tables with plants.”

Another hot topic in DH, which we have been reading and talking about in class and which Turkel’s talk and workshop had me thinking about, is tenure and how DH building projects relate and are counted toward an academic’s tenure portfolio. In their chapter “Developing Things: Notes toward an Epistemology of Building in the Digital Humanities” in the book Debates in the Digital Humanities, Stephen Ramsay and Geoffrey Rockwell ask “how do we” and “can we” count the work of builders, hackers, and coders as scholarship. How is work on and about XML, XSLT, GIS, R, CSS, and C counted and evaluated? Is it scholarship? How will it be evaluated, and can it lead to promotion and tenure? Lev Manovich believes “a prototype is a theory.” Stan Ruecker and Alan Galey say that “the creation of an experimental digital prototype [should] be understood as conveying an argument about designing interfaces” and that digital artifacts themselves, not just their surrogate project reports, should stand as peer-reviewable forms of research, worthy of professional credit and contestable as forms of argument: “It is the prototype that makes the thesis, not discursive accompaniments like white papers, reports and peer-reviewed papers.”

My question is: are these beliefs truly put into practice in academia today? Can a faculty member include the objects they make using Max 6, Phidgets, Arduino, or Makey Makey in their tenure portfolio and have it count in a meaningful way toward tenure? Are academic departments and institutions willing to accept this type of work as scholarship worthy of tenure? And if not, what does that mean? Should we stop making things, or should we continue to make things even if they are not counted? How can we work to ensure that this type of work is considered scholarship?

Finally, Turkel talked about failure and what can be learned from it. He talked about how some of the best students in his class are those who have no training in programming or shop classes, and who therefore have no preconceived notions of what can and cannot be done and are not afraid to fail. At various library jobs where I supervised staff, I would tell them not to be afraid of making a mistake when working on the library catalog. I would tell them to be inquisitive, to explore the program, and to ask questions. I assured them that I had set things up so that they couldn’t destroy the catalog, and that their exploring and using the system was how they and I would learn. I try to follow this philosophy myself when setting up database interfaces and catalog systems. However, it is not always easy. Fear of failure, and of the consequences of that failure for job security (and sometimes grades), is real.

I think it is great that Turkel is able to assure his students that they will not fail his class if their projects fail, but in many instances, in many jobs, this is not a promise one is given. I always joke that the only job where you can be wrong all the time and fail and not get fired is weather forecaster. I say it jokingly, but it is somewhat true. In academia and in corporate culture, having a project fail is not always looked upon in a positive light. As the people feeling the heat over the federal government’s Affordable Care Act Marketplace website can attest, the public does not seem to think the site’s current problems are “learning experiences.” How, then, do we promote inquisitiveness and a willingness to take chances, and possibly fail, in the projects we work on in DH, without fear of the consequences of our failure?

“I’ve missed more than 9000 shots in my career. I’ve lost almost 300 games. 26 times, I’ve been trusted to take the game winning shot and missed. I’ve failed over and over and over again in my life. And that is why I succeed.” — Michael Jordan, in a Nike ad written by Jamie Barrett. http://youtu.be/GuXZFQKKF7A


10/14/13: Matthew Kirschenbaum, “The Literary History of Word Processing”

Matthew Kirschenbaum spoke about his forthcoming book project, which was recently profiled in The New York Times.

Kirschenbaum’s research asks questions such as: When did writers begin using word processors? Who were the early adopters? How did the technology change their relationship to their craft? Was the computer just a better typewriter—faster, easier to use—or was it something more? And what will be the fate of today’s “manuscripts,” which take the form of electronic files in folders on hard drives, instead of papers in hard copy? The talk, drawn from his forthcoming book on the subject, offered some answers, and also addressed questions related to the challenges of conducting research at the intersection of literary and technological history.

Matthew G. Kirschenbaum is Associate Professor in the Department of English at the University of Maryland and Associate Director of the Maryland Institute for Technology in the Humanities (MITH), an applied think tank for the digital humanities.

Know Your Typewriter History

Film still from "Know Your Typewriter" courtesy of Prelinger Archives and archive.org

Film still from “Know Your Typewriter” courtesy of Prelinger Archives and the Internet Archive

In “Gibson’s Typewriter,” Scott Bukatman writes about the irony that William Gibson’s cyberpunk novel Neuromancer was composed on a manual typewriter. Distinguishing himself from the postmodernists who have declared the end of history, Bukatman argues that “the discourse surrounding (and containing) electronic technology is somewhat surprisingly prefigured by the earlier technodiscourse of the machine age.” To explore the “tropes that tie cyberculture to its historical forebears,” Bukatman says that he wants to reinstate the history of the typewriter “in order to type history back into Neuromancer.”

Typing history back into Neuromancer turns out to be quite a challenge because, as Bukatman says, “The repression of the typewriter’s historical significance in the Neuromancer anecdote has its analogue in the annals of technological history. No serious academic investigation of the typewriter has been published, to my knowledge, and almost all curious writers seem to rely upon the same two texts: The Typewriter and the Men Who Made It (hmmm . . .) and, even better, The Wonderful Writing Machine (wow!), both highly positivist texts from the 1950s.”

I started out interested in, and sympathetic to, Bukatman’s aims. He is a gifted writer who skillfully pulls apart and teases out the meaning of the 1950s texts. I had the impression that I was reading the “truest” version of the history of the typewriter available at the time Bukatman was writing. I was curious, though, about other histories of the typewriter that might have been published after his piece was written.

After some research I was surprised to learn that there are quite a few histories of the typewriter, almost all of which were published well before Bukatman’s essay. See the Smithsonian’s bibliography of the typewriter and Google Books (related books links). A number of these were written by collectors or have illustrations targeted to collectors, but several are more serious, with Michael H. Adler’s The Writing Machine widely regarded as the most accurate. Despite Bukatman’s claim that at the time of his writing there weren’t academic books about the history of the typewriter, one of the two histories he cites, The Typewriter and the Men Who Made It, was written by a professor at Urbana, published by the University of Illinois Press, and reviewed in a journal of the Organization of American Historians. Another example from academia is George Nichols Engler’s dissertation, The Typewriter Industry: The Impact of a Significant Technological Revolution (1969).

Am I simply being pedantic by pointing this out? I don’t think so. Bukatman, Professor of Art and Art History at Stanford, declares that his task is “reinstating history,” and the recitation of that history comprises about a third of the essay. Calling the lack of authoritative histories a “repression” and claiming an analogy to the “repression” of the anecdote about Gibson’s manual typewriter in cyberculture is central to the structure of his argument. And, through his fluent analysis of the texts he has chosen, he seems to present himself as an authority who has culled the best that is available.

The feeling that I am left with as a reader is of being misled by the writer (however inadvertently), and that, to borrow Bukatman’s phrase, the “disappearance [of history] was little more than a trope of a postmodern text.”