Author Archives: Sarah Welsh

[Cross-Posting] On Successful #DH Project Management

Project management is difficult. As one of my teammates said to me point-blank: “I would not want your job.”

As our team began to work on Travelogue, I assumed that my brief stint organizing the development of two separate websites in various professional settings would help me. But while a background in marketing has allowed me to think more critically about things like publicity, nothing really prepared me for managing people my own age in a setting where we do not receive salaries for our work.

And while I have been extremely lucky to work with a group of brilliant people who are invested in helping me complete the project, it has been tricky figuring out how to tell people what (and how much) to do; everyone has full lives outside of school.

In a work setting, orders would come down from my boss, who had little idea of the actual steps we needed to take in order to complete a website. The details of these orders were laid out for me by advanced IT and design departments, each of which had their own ideas about how the website should look and behave. In this project, where I am the “boss,” things were more difficult, especially because while all of us have great ideas, the actual means of execution can be unclear. But just because you have only a basic understanding of web design, it does not mean you can’t build something (mostly) from scratch. You just need a good plan.

Websites and website redesigns can (and do) take years to complete, but for this project, we only have about four months. In the course of this semester thus far, I’ve found that a few things are essential to completing a project successfully. Some seem obvious, but when you are trying to keep a bunch of different wheels spinning, simple things can be easy to forget.

(Of course, this is not a complete list.)

Know Your Deliverables

What are the major tasks that need to be completed in order to produce a final project? In the course of a semester, what needs to be completed from week-to-week in order to get things done? Setting some key deadlines, and being able to adjust them, will help the project move forward. I made a simple project plan in an Excel document that was arranged by week, with a new goal for each Monday. From there, I doubled back and talked to my group members about what needed to be completed for each goal. I am indebted to Micki Kaufman for major assistance here, as well as to Tom Scheinfeldt’s lecture last semester.
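The Monday-by-Monday plan described above can be sketched in a few lines of code. This is a minimal, hypothetical example — the goals and tasks below are placeholders, not the actual Travelogue plan — but it shows the structure: one goal per week, each broken into tasks, with deadlines computed from a start date.

```python
# A minimal sketch of a week-by-week project plan, as described above.
# The goals and tasks here are hypothetical placeholders.
from datetime import date, timedelta

# Each Monday gets one goal; each goal breaks down into smaller tasks.
plan = [
    {"goal": "Finalize project scope", "tasks": ["draft proposal", "confirm platform"]},
    {"goal": "Set up site skeleton", "tasks": ["install software", "choose theme"]},
    {"goal": "Gather content", "tasks": ["collect texts", "scan images"]},
]

def milestones(start_monday, plan):
    """Pair each goal with its Monday deadline, one week apart."""
    return [
        (start_monday + timedelta(weeks=i), entry["goal"])
        for i, entry in enumerate(plan)
    ]

for due, goal in milestones(date(2013, 9, 2), plan):
    print(f"{due}: {goal}")
```

The same layout works just as well in a spreadsheet, which is what I actually used — the point is simply that every week has exactly one named goal, and every goal has an owner-ready task list behind it.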

Use Your Support Network

There are experts at your school who can help you. As it goes with everything, being afraid to ask for help can (and will) diminish your success.

Know Your Team’s Strengths (and Weaknesses)

Project management involves a good deal of emotional intelligence. Knowing where your group members are coming from, and being aware of and sensitive to what they can and can’t accomplish in a given time frame, will provide for a better outcome. It kind of goes without saying that actively listening to your group members’ concerns and ideas will make them more invested in your goals.

Be Flexible

This goes for padding extra time into your project plan in case you hit roadblocks or an unexpected learning curve, as well as being open to adjusting your vision and/or timeline. It can be hard to let go of original ideas, but if they aren’t working, it’s important that you are able to recognize that and let go. In the case of Travelogue, our project scope changed slightly from what I originally proposed once we learned more about our platform.

Relax (a Little Bit)

In working on a major project with a tight deadline, not only is it important to manage your expectations, but it is also important not to put too much pressure on your group. My personality defaults to surface-level relaxation that can be misinterpreted as lackadaisical, when usually (like anyone else) I’m managing a huge amount of internal stress. I try not to micromanage my team as a result of my internal freakouts, which would make anyone stressed-out and disengaged. At the same time, being too lax about deadlines says: “I don’t really care.” If you don’t care, neither will they.

We are currently buzzing around our computers to get this thing done, with constant revision of the plan to keep things in motion.

Visit: https://travelogue.commons.gc.cuny.edu

And here is a link to the project plan for anyone who’s interested: https://docs.google.com/spreadsheet/ccc?key=0As13_khVZTLXdHBMV2NlNWwtTndiRTZsUk1QQTVWYnc&usp=sharing

What About Alt-Cubicle?

The question posed to MA students towards the end of class on the PhD/alt-ac debate was something like: “How does this make you feel?”

I would say “optimistic, but cautious.” But because my goal is ultimately a PhD, and because the job market is tepid at best, why am I optimistic about it at all? Why is anyone? I know that I want to teach, and I don’t have faith in the job market, yet I still convince myself that it is an option that will be open to me if I work hard enough. I think a lot of people who want to go into teaching are also optimists, and also want to help better the world and contribute knowledge and such. If there are other things we can do that would satisfy the same desires, shouldn’t we know about them?

A thread running through our conversation and the articles I’ve read about career identity seems to point to a common problem: when we start a career, how do we know what we’re really getting ourselves into?

My mom has worked as an attorney at the same company for the past 30 years. She thought that the current overflow of law school students would be solved if they knew what being a lawyer actually meant. Maybe the same problem could be fixed if a requirement for entrance into PhD programs were to follow professors around and see what it’s actually like to be a faculty member (assuming you want to be one).

So why are we not given more information about potential careers before we embark on them (outside of “take your daughter to work day”)? Aside from dire warnings about the job market, I think some practical advice is in order before anyone continues down a rocky path.

This article may be one example of why we don’t get more in-depth information about careers. Any honest transparency could mean that you lose your job or a reference. It’s the same reason people aren’t honest in exit interviews.

Something else about the alt-ac debate that I find disconcerting is this idea that people with PhDs in the humanities can fall right into publishing – a field that is often grossly underpaid and quickly losing momentum.

I think a lot of people from my generation look at their parents’ successful careers, pension plans, and slight unhappiness and ask themselves what they can do differently while still finding success. For some, this is a constant exploration that may lead somewhere like graduate school. And because a lot of us are wholly unsatisfied sitting in a cube working at a boring job all day, maybe we just need to be fully aware of our options and what it takes to get someplace else.

What Does Digital Humanities Mean for Pedagogy?

I was so happy to read Stephen Brier’s essay in Debates in the Digital Humanities, which echoed many of my thoughts:

…this recent rush toward the technological new has tended to focus too narrowly, in my judgment, on the academic research and publication aspects of the digital humanities, in the process reinforcing disciplinary “silos” and traditional academic issues while also minimizing and often obscuring the larger implications of DH for how we teach in universities and colleges and how we prepare the next generation of graduate students for careers inside and outside of the academy.

For me, yesterday’s class discussion really got to the heart of my Digital Humanities questions. Digital tools like text mining and data visualization are nice for higher-level research, but what about undergrads preparing to go out into the real world? Yes, these things are used by people outside of the academy (see here and here), but shouldn’t we be introducing students to a broader range of digital tools earlier on? When we incorporate technology into pedagogy, is it “Digital Humanities,” or just teaching and the internet? Does it matter?

As a Literature major, I was taught minimal digital skills in my undergrad courses. I took one required programming course that didn’t really stick, but I had a basic understanding of the internet and computers because I grew up in the 90s. While the degree helped, of course, I was hired into my first desk job mainly because of the practical skills I picked up on my own. When I started working at a magazine, I picked up things that anyone in media needs—a basic understanding of HTML, how to work with content management systems, and how to be a project manager. My school didn’t offer courses in digital media, though I’m sure students who have the foresight to know what kind of job they want (that is to say, not most 18-year-olds) would be able to pick up similar skills in such classes at their own universities.

Many of my undergraduate teachers encouraged us to incorporate digital tools into our projects, I guess assuming that most kids these days were adept at programming and building websites. I remember only one student going this direction in a Creative Writing seminar – a friend of mine who learned to code on her own because she enjoyed it. On presentation day, instead of reading from a novella, she presented a video game, and moved the characters around the screen to act out her story. Jaws dropped. How could an English major use computers like that?

Anyway, that Literature student works for Google now.

As Stephen Ramsay says in his Programming with Humanists essay, “…if an English or a History student finds his or her way into a class on programming, it is not because of some perceived continuity between the study of Shakespeare or the French Revolution and the study of for-loops and conditionals. Most students see programming—and not without justice—as a mostly practical subject” (228).

It is to an undergraduate’s advantage, especially if he or she will not pursue graduate school, to have an active understanding of technology and its practical uses (even if they aren’t working for Google). I had a rockstar intern at my marketing agency who knew some HTML and could help me with e-mail newsletters, could use Adobe Illustrator and InDesign to help me put together marketing brochures, and knew how to use Excel to make charts and do calculations (thankfully for me, because Zzzzz…). He will have a leg up on other people his age applying to the same jobs after graduation. I usually assume most humanities students learn practical job-related skills on their own in an internship, or when they get a Master’s degree. And while internships are a great place to learn on-the-job skills, many students in smaller communities don’t have this opportunity.

While undergrads in the humanities are learning excellent skills such as how to present an effective argument, they aren’t getting enough practical skills from humanities teachers that will make them competitive in the current job market. But outside of programming, business, science or digital media, how do you do this for students with broader interests who are unsure of what exactly they want to pursue for work? To further complicate things, because technology changes as quickly as it does, how can you ask teachers to keep up?

I think the whole area of pedagogy is where it becomes most important to define what we’re talking about when we talk about “DH.” Shouldn’t we be teaching students to use technology effectively in order for them to better interact with the modern world? The Looking for Whitman project is a great example of how you can combine practical skills (collaboration, writing for an audience, using blogging software) with academic ones (thinking critically about texts, etc.). While writing an effective essay is important, it isn’t everything. This recent Slate article brings this point home.

For me, DH has wider implications for the university system because the people who are involved seem to be the most open to new ideas. Without trying to seem too idealistic, shouldn’t we be harnessing this power somehow to change the system and the way students learn, rather than just using it in our own research?

As Brier says,

CUNY’s growing focus over the past two decades on the scholarship of teaching and learning has by no means been limited to the digital humanities, narrowly defined. If we are willing to broaden our definition of digital humanities beyond academic research and related issues of academic publication, peer review, and tenure and promotion to encompass critical questions about ways to improve teaching and learning, then CUNY’s various digital pedagogy projects and strategies offer an alternative pathway to broaden the impact of the digital humanities movement and make it more relevant to the ongoing and increasingly beleaguered educational mission of contemporary colleges and universities.

Educating the next generation of informed citizens ultimately falls on the shoulders of teachers. Now that technology is a part of that world, it should be a part of teaching as well. Because the impacts of technology are forcing things to move faster than, say, the printing press did, I don’t see academia catching on quickly enough on its own. But it’s also important to note that everyone is struggling to keep up, not just the academy.

The Science/Humanities Gap

A few of the DefiningDH blogs have touched on the gap between digital research methods in the sciences and the humanities, and how humanists can use technology in their work. Here is a recent NY Times article I stumbled across on this topic:

http://opinionator.blogs.nytimes.com/2013/09/18/sciences-humanities-gap/?_r=0

Without mentioning Digital Humanities per se, the author (who is responding to another interesting article about how humanists MUST embrace the sciences) believes humanists are well aware of this gap:

Pinker notes the antiscientific tendencies of what he calls “the disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness.” But literary studies, the bastion of these tendencies, have long been moving in other directions, including a strong trend toward applying scientific ideas and methods. There is, for example, the evolutionary and neurological study of literature and, most recently, the use of computer data-mining.

There is, then, good reason to think that the greater problem is scientists’ failure to attend to what’s going on in the humanities.

In the readings this week, Lev Manovich poses a similar problem in relation to data access and interpretation:

I have no doubt that eventually we will see many more humanities and social science researchers who will be equally as good at implementing the latest data analysis algorithms themselves, without relying on computer scientists, as they are at formulating abstract theoretical arguments. However, this requires a big change in how students in humanities are being educated.

Manovich leaves this question open-ended, and it’s a big one. Both authors seem to be bothered by disciplinary narrowness and a lack of cooperation across disciplines.

I don’t know about anyone else, but part of the reason I was attracted to Digital Humanities was the fact that many of my research and teaching questions can’t be answered by taking more Literature classes.

Defining the Digital Humanities


DH before class: Using technology to study traditionally non-technologically inclined fields in new ways.

DH post-discussion: Using technology to shape learning, teaching and research.

Because academia often exists in a big bubble, it seems like DH scholars should be experts in keeping up with technology (as much as they can) as it pertains to research, teaching, and learning. There are tools in wide use outside the academy that academics should know about and be able to use themselves.

I guess I’m still a little stuck on the word “humanities.” Is Digital Humanities about digitization and technology in academia in general, or just within the humanities? If you don’t confine it to a field, is it just about technology? It will be interesting to explore the parameters. I also keep coming back to what was mentioned in class, about how people in the physical sciences think this debate is silly, because of course you should be collaborating and using technology in your research.

I’m interested in writing, literature, and publishing, so the scope of this definition is pretty huge here. If you think about the future of books, for example, which is obviously an important question for universities (or it should be), the purpose is further complicated (or has room for expansion). Is the future of publishing something that should be addressed by publishers, or should Digital Humanists be doing this research? Are DHers using the tools, developing the tools, or both? So is it just about research and teaching, or about technology in the world in general? After our discussion ended, the definition became broader, more overwhelming, and even more fascinating. I’m hoping DH scholars are aiming to pop the academic bubble.