
Research Proposal: Profile of digital scholarship activities at York University

Published on Oct 14, 2019

Research Problem

The goal of this study is to create a holistic understanding of digital scholarship practices and needs at York University, which will serve the York University Libraries’ strategic goal of supporting and fostering digital scholarship practices.

Digital scholarship intersects with all five of the research opportunities outlined in York University’s Strategic Research Plan.[1] The definition of “digital scholarship” will be intentionally left open in this study, allowing researchers and educators to self-identify teaching- and research-related activities that have digital or technological components.

Though this work could be framed as program planning instead of research, its design as a research study enables several key outcomes:

  1. Documenting strengths-based approaches to needs assessment in a library and information science (LIS) context, which seldom appear in the literature

  2. Informing a programmatic approach to the development of digital scholarship supports at a large Canadian university, whose cross-disciplinary and experiential learning practices are aligned along six interrelated themes[1]

  3. Addressing the lack of studies of digital humanities and digital scholarship supports in Canadian post-secondary institutions

This research makes a valuable contribution to the literature as one of only two known case studies for the planning and development of a digital scholarship centre in Canada.

Literature Review

History and definitions of digital scholarship

The term “digital scholarship” begins to appear in scholarly literature as early as 1999[2], but its initial scope is narrow. Its earliest representations appear in the domains of higher education and librarianship, referring to digital scholarship as the digital output of scholars. At the turn of the millennium, as web technologies were becoming dominant means of communication, concerns were being raised about the preservation of digital materials — whether they were surrogates for existing, paper-based scholarship or “born-digital” publications that might include multimedia content in addition to electronic text.[3]

Since about 2009, the use of the term has become more widespread; its scope, more encompassing. A systematic review of scholarly literature on digital scholarship, conducted in 2016, found discourse on the topic connected to three distinct domains: networked scholarship, digital libraries, and digital humanities. The most commonly cited sources across the reviewed literature were both from the field of education: Boyer’s Scholarship Reconsidered: Priorities of the Professoriate[4] (published in 1990, and which does not contain the term “digital scholarship”) and Borgman’s Scholarship in the Digital Age, released in 2007.[5]

Early, more narrow definitions of “digital scholarship” appear to have collided in recent years with the development and definition of “digital humanities;” in fact, one author has recently and indirectly suggested that “digital scholarship” is the result of forging “digital libraries” and “digital humanities” together.[6] The evolution of the term “digital humanities”—which has itself been influenced by larger conversations about education in the information society—has had significant influence on “digital scholarship.” By the late 1990s, computing technology had been used as part of linguistic and textual studies for decades. Names for these practices included terms like linguistic computing, textual computing, or humanities computing, with the last of these eventually emerging as dominant.[7] Debates about the scope and nature of humanities computing—including whether or not the field had advanced past the creation of experimental methods and become an academic discipline in its own right[8]—resulted in one of the earliest formal uses of “digital humanities” as an academic term: the University of Virginia’s creation of a Master’s Degree in Digital Humanities in 2001.[9] As rationale for the program, John Unsworth stated:

The basic impetus for this degree program is the simple observation that our culture and our cultural heritage are migrating very rapidly to digital forms, and in order to manage that migration and take advantage of the new intellectual and creative possibilities it offers, we will need trained professionals who understand both the humanities and information technology, and we will need them in a number of different areas—in museums, libraries, teaching, scholarship, publishing, government, communications, entertainment, and elsewhere.[10]

Here we see a definition of “digital humanities” that begins to encompass the fields of librarianship, education, and the processes of scholarship in general.

2004’s Companion to the Digital Humanities—co-edited by Unsworth—further codified the digital humanities field with chapters that outlined its application to disciplines ranging from archaeology and history to literary studies and the performing arts. The book covered a wide range of applications of digital technology to humanities research and noted that its methods and tools were as applicable to research and analysis as they were to knowledge dissemination and preservation.[11] Within a few years, self-identified digital humanists had developed multiple manifestos outlining the discipline’s values including openness, co-creation, critical knowledge production, equity and inclusivity, iterative scholarship, and process over product.[12] By 2012, the scope of the digital humanities had been generalized as encompassing practices of digitization, computation and analysis, modeling, classification and description, curation, analysis, editing, and communication.[13] At this time, scholars had already pointed out that the practices associated with digital humanities had begun to touch every aspect of scholarship,[14] including tenure and promotion;[15] contemporaneously, Johanna Drucker noted that libraries—commonly viewed as the core of universities—were the natural home for “digital scholarship” in the post-print era of research, learning, and communication.[16]

In the first two decades of the third millennium, then, we see the gradual shift from computational methods for text analysis (linguistic computing), through the establishment of a field dedicated to understanding the effects and possible applications of those methods (humanities computing), to the use of digital tools and technologies as part of every aspect of humanities and social sciences research and teaching (digital humanities), and ultimately to their application in every aspect of scholarship (digital scholarship). This lexical transformation merges the current, broad interpretation of “digital scholarship” with the more narrow, artifact-centred definition that arose in 1999.

In parallel to the ongoing and occasionally overlapping discourse on digital humanities, which still focuses on humanities and social sciences, recent conversations on the specific topic of digital scholarship dwell in the domain of librarianship and information studies, with some spill-over into the more general realm of higher education.[17] One of the earliest established centres for digital scholarship (originally tailored to the humanities and social sciences in 2006, and now encompassing scholarship in general alongside an established digital humanities practice) is the University of Virginia Library’s Scholars’ Lab.[18] Its creation marked a shift from digital humanities centres or institutes administered by humanities and social sciences faculties to digital scholarship centres that have almost always been located in (or intimately partnered with) academic libraries and run by dedicated staff.[19]

Despite the now-frequent colocation of digital scholarship practices and libraries, definitions of “digital scholarship” have remained broad. Most agree on the centrality of digital methods and tools, but disagree on scope. Some definitions limit the field to the inquiry[20] and/or the dissemination of research,[21] and others emphasize its impact on scholarship’s non-linear discovery, use, and reuse.[22] Few definitions explicitly mention teaching practices, but this research proposal interprets other definitions’ choice of “sharing”, “use”, and “dissemination” to include pedagogical work done both inside and outside formal instructional environments. Our working definition follows Mulligan, who expanded on an earlier definition by Rumsey:[23]

Digital scholarship is the “use of digital evidence and method, digital authoring, digital publishing... and digital use and reuse of scholarship,” encompassing research and publication in “print and web-based text, video, audio, still images, annotation, and new modes of multi-threaded, nonlinear discourse.”[24]

Finally, an important finding related to definitions has appeared in the literature, and will have an impact on this study’s design. In 1999, Rockwell[25] noted that the term “multimedia” was preferred to “humanities computing” in a curriculum proposal because the term “is meaningless to people outside its traditions and the program was unlikely to be approved with such an awkward name.” In an echo of this sentiment, other studies have noted that their use of the term “digital scholarship” during needs assessment yielded poor results because respondents did not see themselves or their work—even if it was heavily influenced by digital tools and methods—in that light.[26] Accordingly, this study will avoid using the term wherever possible, allowing students and researchers to self-identify the practices they view in a digital context. This follows the view of digital scholarship advanced by Askey in 2014: “Eventually, within a generation, we will just call it research again.”[27]

A review of recent LIS and education literature reveals several projects related to the design and implementation of digital scholarship and digital humanities centres, including summarizing reports from the Association of Research Libraries (ARL),[22] the EDUCAUSE Center for Analysis and Research (ECAR),[28] and the Coalition for Networked Information (CNI).[29] Publications related to each of these projects detail trends in digital scholarship centres (DSC), lessons learned from DSC implementations, and case studies or reports on needs assessment activity. Though it is tempting to generalize from the themes and findings in the literature, several reports note that every institutional context is different and must be uniquely addressed.[30]

Rather than generalizing an approach to the operation of a digital scholarship centre, this literature review will summarize some general themes that appear. Five topics span the literature: sustainability, communities of practice, curriculum connections, a student focus, and the importance of training and consultation work. Each of these themes will be addressed briefly in turn.


Sustainability

The topic of sustainability has been addressed by researchers and library staff in multiple needs assessment reports. The University of Calgary found that researchers would prefer a programmatic approach to their research, including plans for long-term preservation of research artifacts.[31] The desire for seed funding, such as pilot project grants, to kickstart digital scholarship is noted,[32] but other reports caution that a mix of project-based and institutional funding or consortial support is required to keep a digital scholarship practice sustainable.[33] Staff reluctance to adopt new services or skills related to digital scholarship appears in multiple needs assessment reports, with a lack of technology (hardware and software), a lack of staff capacity, and an unwillingness to commit to resource-intensive projects noted in findings.[34] Mackenzie advocates an iterative approach to building out digital scholarship supports as space, staff, and financial resources are gathered.[35]

Communities of practice

The spirit of collaboration, and the resources required to foster that spirit, is another common theme in existing literature. Researchers report that they would like access to collaborators, but generally do not know where to find them;[36] that interest in new tools and methods is much stronger than their actual use;[37] and that the availability of collaborative space is critical.[38] In contrast, some existing centres have found that use of dedicated collaboration space is low, or that dedicated “office hours” for collaboration and consultation were not being used.[38] This suggests either that other factors are affecting the ability of researchers to use these resources, or that the implementation of spaces and collaborative opportunities does not align with researcher workflows.

Curriculum connections

Two important findings are of note in relation to how digital scholarship work connects to curriculum. First, faculty members are interested in using digital tools and methods as an integral part of their teaching work, linked to learning outcomes.[39] Second, there is a general recognition that not all of a student’s learning needs are being met in the classroom, and that supplemental skills training and experiential work is required.[40]

Student focus

Connected with the previous theme, it is generally agreed that investments in students—typically at the graduate level, but not exclusively—lead to better success in digital scholarship practices. Nowviskie[41] highlights the University of Virginia’s focus in this area, noting that today’s graduate students are tomorrow’s faculty, and students are deeply connected (through project work, research assistant assignments, and other mechanisms) to the execution of ongoing research. Graduate students typically express a stronger interest in digital tools than faculty members, and there is an acknowledgement that hands-on experience with digital tools and methods will serve them in their careers.[42] These findings echo a trend that is already appearing in academic libraries: they are shifting towards more student-focused activity, viewing students as partners rather than patrons.[43]

Training and consultation

Almost universally, digital scholarship centres offer skills training, workshops, and other opportunities to learn about digital research methods. Needs assessments highlight this demand, with the University of Calgary’s project revealing a strong interest in mentorship opportunities, community-driven, peer-based learning, and speakers’ series or seminars.[31] In line with the demand for hands-on experience, one study showed that students were interested in learning about library holdings (primary sources, data, etc.) and tools or methods for working with that content, whereas faculty members were more interested in the content itself.[44] Several papers note that skills and methods training is often eclipsed by other commitments, the lack of time or funding, or the absence of appropriate infrastructure for providing instruction.[45] Finally, Henley & Bell[46] remind us that centres benefit by bringing novices and expert practitioners together in the same space, and that library staff are also interested in learning digital tools and methods.

Digital scholarship needs assessments

As recently as 2017, reports were pointing out that few institutions have done full needs assessments for digital humanities or digital scholarship centres.[47] It has been far more common (and it is recommended) to gather information by speaking with peer institutions, examining their programs and resources, and connecting with their centres’ key stakeholders.[48] Accordingly, environmental scans almost universally form the foundation of digital scholarship centre planning and development. These scans are typically done with two scopes: a survey of existing service offerings within the institution under study, and a survey of offerings from peer institutions.[49]

A number of formal needs assessment activities have been documented—related to digital scholarship centres or more specific digital humanities centres—and they represent a wide range of research methods. Survey, focus group, and semi-structured interview techniques appear most frequently in the literature, and are often used in combination.[50] Some studies have supplemented formal approaches with data collected through more informal means: for example, instructors who conduct existing digital tools and methods workshops can speak to attendees about the work they’re doing and the results they’re trying to achieve.[51] There are some cautions against relying on general-purpose surveys because the results typically do not identify who is doing digital scholarship or digital humanities work,[28] the instruments are difficult to design well, and they do not allow for follow-up questions.[52] Focus groups and interviews each have advantages and disadvantages. Groups allow for building and prioritizing ideas, but may not have time to cover all topics and can be dominated by individual participants. Interviews permit more focused discussions and are an excellent source of contextual information, but are time-consuming and results from different participants may contradict one another.[52] All of these reasons justify the use of mixed-method approaches for needs assessment.

For the most part, the existing literature on digital scholarship needs assessment activity highlights the value of speaking with graduate students, faculty members, and other researchers: in other words, the communities that are actively engaged in work with digital tools and methods. Library staff are occasionally included,[53] and one project took pains to conduct focus groups with research support officers, deans, and undergraduate students.[54] Participant pools can be built by collecting information on grant winners, from faculty or departmental websites, from library staff who work with researchers, from a canvass of recent publications from the institution, or by consulting research support staff.[28] Interview participants may also provide additional leads during data collection, allowing supplementary recruitment using the snowball method.[55]

Some general advice regarding the planning and development of digital scholarship centres is worth noting. Rather than trying to serve every need, centres should create networks of collaborators who share interests.[46] Libraries can play a role by advocating full lifecycle planning and management of digitally-enabled projects,[56] and centres will find value in partnering with existing units that have specialized expertise.[57]

In line with that notion, ECAR[47] recommends an assessment of existing areas of strength, with the goal of “leaning in” to activities that are already performing well. Tracking contacts made during the needs assessment will facilitate this work.

Needs assessments and academic libraries

Watkins, Leigh, Platt & Kaufman[58] discuss three primary factors that differentiate needs assessment approaches:

  1. Means vs ends, where assessments are used to either recommend activities (methods-based) or determine desired outcomes (ends-based). The authors suggest that ends-based approaches are more useful;[59]

  2. Level or scope, which examines “bottom line”-based approaches to needs assessment (operational needs assessment), rather than holistic approaches that include societal or community outcomes (strategic needs assessment) (Watkins, West Meiers, & Visser, 2012); and

  3. Variations in technique, including the selection of assessment tools. Here, the authors assert that methods are “best selected only after the purposes and primary targets of a needs assessment are known and justified” (p. 42).

Most approaches to needs assessment are based on a deficit model, where some missing element is discovered based on what actions or conditions are absent from an organization (Hannum, 2013; Watkins et al., 1998, p. 41). The appreciative inquiry approach (Cooperrider & Whitney, 1999), derived from the action research model, upends this tactic with a strengths-based focus where existing, successful conditions are used as the basis for understanding what may be possible. Blended techniques that combine both approaches are advocated (Watkins et al., 2012, p. 62).

[to be added: there is some evidence that libraries almost always use a deficit model for needs assessment, and generally undertake “needs assessment” as a well-defined, concrete method, rather than one informed by theory and capable of a broader range of techniques (such as a strengths-based approach)]

Research Questions

Working from an action research framework, and informed by strengths-based approaches, the proposed study will explore the following research questions:

  1. How do York University faculty members and graduate students use research, teaching and scholarly communications practices that incorporate digital data, tools, or methods?

  2. What current supports for digital scholarship exist for the benefit of York University students, staff, and faculty, and what is their scope (for example, are these supports accessible to all, or are they tailored for use by a specific department or faculty)?

  3. What is the current state of the “community of practice” established by faculty members, graduate students, and affiliated researchers who employ digital data, tools, and methods as part of their work?

  4. What outcomes for digital scholarship practices would faculty members, graduate students, and library staff like to see?


Definitions

Action research: As a broad field of inquiry with conflicting definitions and interpretations (Rowell, Riel, & Polush, 2017), action research has recently been framed by the American Educational Research Association’s Action Research Special Interest Group as a set of defining characteristics, which begins:

Action research seeks transformative change through the simultaneous process of taking action and doing research, which are linked together by critical reflection. Action research practitioners reflect upon the consequences of their own questions, beliefs, assumptions, and practices with the goal of understanding, developing, and improving social practices. This action is simultaneously directed towards self-change and towards restructuring the organization or institution within which the practitioner works.[60]

The non-objective standpoint of the researcher, who is connected to the community or issue under study, is specifically acknowledged as one of the practice’s defining characteristics.[61]

Appreciative Inquiry: A theory of change that intentionally sets aside principles of problem-based management to focus on questions that mine and improve upon existing strengths. The term was first coined by David Cooperrider in 1980, and relies on an action-research approach to drive its methods (Cooperrider & Whitney, 1999).

Digital scholarship: The “use of digital evidence and method, digital authoring, digital publishing... and digital use and reuse of scholarship,” encompassing research and publication in “print and web-based text, video, audio, still images, annotation, and new modes of multi-threaded, nonlinear discourse.”[62]

Data Collection Procedures

To create a complete picture of digital scholarship practices across the university, three populations at York University will be consulted, with qualification to participate dependent on each person’s previous, current, or intended use of digital tools or methods for teaching, learning or research:

  1. Graduate students at the Master’s or PhD level, as well as postdoctoral fellows

  2. Faculty and other scholars involved in research and teaching

  3. Library staff

For all three populations, a series of semi-structured interviews is proposed. The information gathered from these interviews will guide the design of a more focused quantitative survey that will be used, in a second phase of data collection, to validate and reinforce interview findings.

Questions for each interview will follow an appreciative inquiry (AI) model in pursuit of the study’s research questions but will vary slightly with each population. The AI model provides a “4D” approach for research that encompasses four stages: discovery, dreaming, design, and delivery. For the purposes of strategic needs assessment, interviews will identify strengths and opportunities using the first two stages:

  1. Discovery: what are the current areas of strength in your area? Who is performing this work and what makes it successful?

  2. Dream: what might be possible in the future?

The latter two phases of the AI model—design and delivery—will be implemented outside the scope of this project, as part of tactical and operational program design.

The choice of a strengths-based approach for this research is intentional. It has been selected in response to criticisms of more typical deficit-based or problem-based approaches to needs assessment and program planning. For example, Hannum[63] outlines a number of unintended consequences of deficit-based approaches including an erroneous focus on what organizations do poorly; this prioritizes correcting weaknesses over amplifying strengths, and can create feedback mechanisms (after Arthur[64]) that cause old patterns to become entrenched and reinforced rather than disrupted. Framing a needs assessment as a “problem” may impact staff morale, as well, with the suggestion that previous efforts have been insufficient. In the case of York University Libraries, there have been previous initiatives to take a more strategic approach to digital humanities and digital scholarship work. A strengths-based approach celebrates the achievements of those efforts and allows their strongest ideas to resurface as part of the proposed consultation.

Our qualitative interview instrument was constructed based on models provided by other institutions (Brenner, 2014; Lindquist, Long, Watkins, Arellano, & Dulock, 2013; Maron & Pickle, 2015; Lippincott, 2017; University of Calgary, n.d.).

Existing case studies and other literature have suggested some alteration of the interview questions used for other projects. First, following the experience of Mitchem & Rice, terms like “digital scholarship” will be avoided wherever possible. Survey results from their needs assessment work showed that many digitally-engaged faculty members did not understand or see themselves in the term; accordingly, the interviews will focus on areas of teaching and research (including scholarly communications) where technology plays a role. Second, where other interviews have divided questions broadly in terms of teaching and research (with similar questions asked, in sequence, for each category), the proposed instrument places its focus on methods and tools. It is hoped that this alignment will allow responses that highlight the interconnected nature of research, teaching, and communication. Third, specific prompts have been added to the interview questions to encourage participants to address their use of technology as part of thesis candidate supervision and other activities not directly related to a course or research project.

The primacy of the semi-structured interview format follows recommendations that specifically advise against reliance on general surveys for strategic needs assessment (ECAR Working Group, 2017, p. 6; Watkins et al., 2012, p. 36).

Appropriate recruitment for this study is of critical importance. The following methods are proposed for sampling the faculty and graduate student populations, with the aim of finding 12 participants in each population:

  • A review of topics represented in recent Tri-Agency grant awards to York University researchers

  • Strategic searches through recent archives of yFile, York University’s “journal of record” for ongoing research activity and related news stories (York University, n.d.)

  • A review of York University’s web presence, with a focus on programs and departments that may have digital elements

Reflecting the assertion that digital methods and tools are ubiquitous in contemporary archives and libraries, up to eight participants for library staff interviews will be recruited from across the York University Libraries. Purposive sampling will be used to ensure that all the organization’s units, divisions, and disciplinary clusters are represented. Archivists, liaison librarians, and those providing reference services and research support may also be used to recruit additional graduate students and faculty members through a snowball sampling approach.

Wherever possible, interviews will be scheduled to take place in participants’ on-campus work areas. Consent will be requested to permit audio recording of interview sessions, with notes being used as a back-up method. It is anticipated that no interview will take longer than 60 minutes. In cases where consent to record is not granted, only the interviewers’ notes will be used. For each class of participants, initial interviews will serve as a “pre-test” to inform possible revisions to interview questions.

Once transcribed, interview data will be coded using a grounded theory approach of open, axial, and selective coding (Corbin & Strauss, 2015).
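Though the coding itself is interpretive work, the bookkeeping around it can be sketched simply. The following Python snippet is purely illustrative (the data structure and function name are assumptions, not part of the study protocol); it shows one way coded transcript segments might be tracked across interviews during the open coding stage:

```python
from collections import defaultdict

def build_codebook(coded_segments: list[tuple[str, str, str]]) -> dict:
    """Group coded transcript segments by code.

    coded_segments: (interview_id, code, segment_text) triples produced
    during open coding. Returns a mapping of code -> list of
    (interview_id, segment_text) pairs, useful when comparing how a code
    appears across interviews during axial coding.
    """
    codebook: dict[str, list[tuple[str, str]]] = defaultdict(list)
    for interview_id, code, segment in coded_segments:
        codebook[code].append((interview_id, segment))
    return dict(codebook)
```

A structure like this makes it easy to see which codes recur across participant groups before relationships between categories are explored.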

It is hoped that survey invitations, sent in the latter half of the study to the broader population of identified candidates, will result in 30-50 responses. A draft version of the planned survey has been created.

Though the specific responses to multiple-choice questions may be altered as a result of interview findings, the general form of the survey will not differ from the draft version.

Survey participants will be solicited with an email outlining the details of the study, alongside a date range for the survey. A link to the survey will be provided, and an online consent form will appear on the survey's first page.
The survey is designed to be completed in about 15 minutes.


There are three main limitations to the design of the study as proposed:

  1. Without having gone through the process of identifying possible interview subjects with the methods outlined in the previous section, it is impossible to know whether the target of 12 interviews for each of the faculty and graduate student populations will be enough to fully describe the range of digital scholarship activity on campus. Relying on existing social networks (for example, current relationships between York University Library staff and researchers) may miss work about which the Libraries is unaware; but relying on Tri-Agency grant information and yFile archives may bias selection towards work prioritized by those agencies or (in the case of yFile) projects that represent “success stories.” Moreover, the selection strategy employed by this study will overlook graduate students and faculty members who are interested in digital tools and methods but have not yet employed them. Because the actual size of each population is unknown, it will be difficult to know when a saturation point has been reached.

  2. Interest in the topic of digital scholarship is as broad as the discipline itself, and though the preference for this study is to use librarian and archivist staff as interviewers, it is likely that interest in this research will be strongest among those who would otherwise be the target of an interview. This will be mitigated by making interested staff members aware of this conflict so that they can self-identify as being more interested in either data collection or participation.

  3. Two of the four interview guides (University of Calgary (n.d.) and University of Pittsburgh (2014)) used to inform the interview questions were derived from the other two guides (University of Colorado Boulder (2013) and Maron & Pickle (2015)). Care must therefore be taken to think critically about the source and goal of each question in order to avoid introducing bias.

Data Management

A complete data management plan has been developed using the Portage Data Management Plan Assistant (Portage Network, n.d.) and is available online.

Interview data and all other study documents will be stored on a shared network location, in an encrypted and password-protected format, with access restricted to researchers and transcribers. Audio files will be kept after transcription is finished, but will be destroyed at the conclusion of the research project.

Survey data will be kept in a password-protected Google account, with access restricted to only the primary researcher, for the duration of the survey collection window. When the survey closes, data will be exported to CSV format and transferred to the project's encrypted, password-protected storage location.

Interview transcripts will be de-identified through the removal of all personally identifiable information, including interviewees' references to specific research projects, other collaborations, or information sources. Responses to questions about other possible interviewees (snowball recruitment) will not be kept in transcripts. To further ensure the accuracy and anonymity of transcriptions, the transcribed text will be supplied to each interviewee, who will have the opportunity to remove other potentially identifying information.
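As an illustration only, de-identification of the kind described above can be implemented as placeholder substitution over a reviewer-supplied list of identifying phrases. The `deidentify` helper and its inputs below are hypothetical, not part of the study's actual tooling:

```python
import re

def deidentify(text, identifiers):
    """Replace each known identifying phrase with a numbered placeholder.

    `identifiers` maps phrases flagged during manual review (names, project
    titles, information sources) to a category label, e.g. {"Jane Doe": "NAME"}.
    """
    counters = {}
    for phrase, category in identifiers.items():
        counters[category] = counters.get(category, 0) + 1
        placeholder = f"[{category}-{counters[category]}]"
        # re.escape guards against phrases containing regex metacharacters
        text = re.sub(re.escape(phrase), placeholder, text)
    return text
```

A sketch like this still depends on a human reviewer to compile the phrase list; automated substitution alone cannot guarantee anonymity, which is why the proposal also gives interviewees the chance to review their transcripts.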

There will be interim steps required for handling and processing of interview audio files (for example, audio and notes captured at source during an interview). These files will be stored on encrypted devices (e.g. laptops with full disk encryption). Wherever possible, audio will be recorded directly to these devices as well. In cases where an external audio recorder is used, audio recordings will be transferred to a laptop or portable hard drive featuring full disk encryption before leaving the interview site. The integrity of the copied audio file will be verified and then the original, unsecured recording will be deleted.
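One way to perform the integrity check before deleting the unsecured original is a checksum comparison. This is a minimal sketch assuming SHA-256 digests; the function names and file paths are placeholders, not a prescribed procedure:

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks to keep memory use low."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(original, copy):
    """Return True only if the copied file matches the original byte-for-byte."""
    return sha256_of(original) == sha256_of(copy)
```

Only after `verify_copy` returns True for the transferred recording would the original on the external recorder be deleted.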

Key considerations are the use of central, secure storage for the collection of audio files and transcripts; mechanisms for ongoing consent concerning the use and licensing of interview transcripts; and a commitment to build a final data package, using the DDI Framework (Data Documentation Initiative, 2019), that will be stored in Scholars Portal Dataverse (Scholars Portal, 2019).

Ethics Review

In order to allow the outputs of this study to be used in the creation of research presentations and papers, advance approval has been obtained from the Office of Research Ethics (ORE) at York University. This included a review of consent letters and procedures, confidentiality agreements, and data management plans. Approval from the ORE was received on November 20, 2019.

If the focus of this research were restricted to the purpose of program development for the York University Libraries, ethics approval would not be required. However, this study’s ability to address multiple gaps in the literature (including a low number of case studies related to digital scholarship needs assessment in a Canadian context, and scarce examples of strengths-based needs assessment methods used for this purpose) makes it worthy of broader dissemination as a case study. The results of the research will likely include quotations from interview participants, and the study aims to make transcripts of the interviews available under an open, attribution-based content licence.

Participants will be de-identified for the purposes of research dissemination, with findings presented in aggregate wherever possible. Since the results of this study may be disseminated through publications, conference presentations, and as-yet-undetermined channels, consent for each output will be requested from participants on an ongoing basis, with draft versions shared in advance of any release to the public.

The topic of this research is not sensitive, but data collection may create secondary connections to other research of a sensitive or competitive nature taking place at York University.


References

Arthur, W. B. (1990). Positive Feedbacks in the Economy. Scientific American, 262(2), 92–99.

Brenner, A. L. (2014). Audit of ULS Support for Digital Scholarship: Report of Findings and Recommendations [Monograph]. Retrieved from

Cooperrider, D. L., & Whitney, D. (1999). Appreciative Inquiry: A Positive Revolution in Change. In P. Holman & T. Devane (Eds.), The change handbook: Group methods for shaping the future (1st ed). San Francisco: Berrett-Koehler Publishers.

Corbin, J. M., & Strauss, A. L. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory (Fourth edition). Los Angeles, CA: Sage.

Data Documentation Initiative. (2019). Welcome to the Data Documentation Initiative. Retrieved September 24, 2019, from

ECAR Working Group. (2017). Building Capacity for Digital Humanities: A Framework for Institutional Planning.

Hannum, W. (2013). Questioning Needs Assessment: Some Limitations and Positive Alternatives. Educational Technology, 53(6), 29–34.

Leigh, D., Watkins, R., Platt, W. A., & Kaufman, R. (2000). Alternate models of needs assessment: Selecting the right one for your organization. Human Resource Development Quarterly, 11(1), 87–93.

Lindquist, T., Long, H., Watkins, A., Arellano, L., & Dulock, M. (2013). Dh+CU: Future Directions for Digital Humanities at CU Boulder. University Libraries Faculty & Staff Contributions, (32).

Lippincott, S. K. (2017, June 22). Digital Scholarship at Harvard: Current Practices, Opportunities, and Ways Forward.

Maron, N., & Pickle, S. (2015). Sustaining the Digital Humanities: Host Institution Support Beyond the Start-up Phase.

Portage Network. (n.d.). DMP Assistant. Retrieved September 24, 2019, from

Rowell, L. L., Polush, E. Y., Riel, M., & Bruewer, A. (2015). Action researchers’ perspectives about the distinguishing characteristics of action research: A Delphi and learning circles mixed-methods study. Educational Action Research, 23(2), 243–270.

Rowell, L. L., Riel, M. M., & Polush, E. Y. (2017). Defining Action Research: On Dialogic Spaces for Constructing Shared Meanings. In L. L. Rowell, C. D. Bruce, J. M. Shosh, & M. M. Riel (Eds.), The Palgrave International Handbook of Action Research (pp. 85–101).

Rumsey, A. S. (2011). Scholarly Communication Institute 9: New-Model Scholarly Communication: Road Map for Change. Charlottesville, VA: Scholarly Communication Institute and University of Virginia Library.

Scholars Portal. (2019). Scholars Portal Dataverse. Retrieved September 24, 2019, from

University of Calgary. (n.d.). Aligning the stars: Understanding digital scholarship needs to support the evolving nature of academic research.

Watkins, R., Leigh, D., Platt, W., & Kaufman, R. (1998). Needs assessment—A digest, review, and comparison of needs assessment literature. Performance Improvement, 37(7), 40–53.

Watkins, R., West Meiers, M., & Visser, Y. (2012). A Guide to Assessing Needs: Essential Tools for Collecting Information, Making Decisions, and Achieving Development Results.

York University. (n.d.). YFile. Retrieved September 20, 2019, from

Implementation Plan

A project breakdown and implementation plan is outlined in this Gantt chart.

Minglu Wang: Hi Kris, I think your broader and more holistic view of “digital scholarship” is very critical, and I want to share with you a “research support” Mendeley library where I have gathered some literature on institutional and library research support over the last couple of years, whenever I bumped into a relevant article. It’s not a systematically compiled library, but it might add some more perspective to your “digital scholarship” body of literature. Just to share a few observations and thoughts on the general research support literature at the academic library: the potential and need for more integrated research support across multiple departments within the institution; the idea that the institution/library could move from a supporter to an enabler; and the need for a better understanding of, and alignment with, the broader context and trends of academia: eScience, engaged scholarship, social impact and needs, ethics and integrity, faculty professional development, and so on. Then, some ideas particular to this project: do you think it would be worthwhile or practical to go a bit beyond the “tools and methods” perspective? Or maybe it’s just that the interview questions make it feel a bit restricted to “tools and methods,” implying an assumption that the library’s expertise aligns better with “tools and methods.” Since it’s a semi-structured interview, I believe the discussions will definitely provide insights beyond “tools and methods,” but it may be worthwhile to add some guiding questions or prompts directly related to researchers’ understanding of, and motivations to address, the most critical research topics in their fields, and the accelerators and catalysts, within and external to the research institution, in those fields. The “Dream” question touches upon this; maybe move it to the beginning to set the tone of the conversation, signalling that we are interested in areas not limited to “tools and methods”?
Along this line, perhaps some wording, for example skills, literacy, practices, habits, mindset, education, life-long learning, conversation, exchange of ideas, etc., to complement “tools and methods” and “community of practice”? Some additional roles to consider for faculty researchers? For example, they are also mentors of graduate students and even of younger faculty researchers in their departments, and even in their fields nationally and internationally. Again, these are just some personal thoughts to share, and they may not fit well with this particular project. Just to let you know again that I think this is a great project and I believe it will be very impactful, both practically and theoretically. Good job!
Kris Joseph: Thank you, Minglu - this is excellent feedback! I settled on the term “tools and methods” out of laziness, I think, and I have been worried that it is too restrictive. I have similar thoughts about separating “teaching” and “research,” because it suggests that the two can’t happen at the same time (e.g. research as teaching!) or that those are the only two activities. What about the ways scholars communicate and collaborate with each other more generally, for example? I really like the wording you’ve suggested, and I think it would be easy to adapt the scripts to remove reliance on the “tools and methods” term.