Now Available: Setting a Foundation for Assessing Content Reuse White Paper

The project team is pleased to release Setting a Foundation for Assessing Content Reuse: A White Paper From the Developing a Framework for Measuring Reuse of Digital Objects project. This white paper (a) provides a broad overview of the Measuring Reuse project, including background information on the AIG, (b) outlines the methods used by the project team, (c) summarizes results, and (d) discusses potential next steps.

The release of Setting a Foundation for Assessing Content Reuse marks the conclusion of the Developing a Framework for Measuring Reuse of Digital Objects grant project.

Interested in learning more about the project over the coming months?

Members of the project team will be reporting out on the results of this project at conferences such as the DLF Forum, the Library Assessment Conference, and the Grey Literature 20 Conference. An article summarizing the first part of the project titled “Barriers and Solutions to Assessing Digital Library Reuse: Preliminary Findings” is being published open access in Performance Measurement and Metrics. You may also contact any project team member to learn more about the background, results, and next steps for the project.

Call for Follow-Up Survey Participation

The release of this follow-up survey is the Measuring Reuse team’s final step in our needs-gathering process. The survey consists of 14 questions and asks participants to prioritize a set of use cases (generated from previously held focus group sessions) to identify the most useful functions of a digital library reuse assessment toolkit.

If you are a cultural heritage or research data professional interested in methods of evaluating reuse of your institution’s digital object collections, consider taking our survey: http://uhlibrary.qualtrics.com/jfe/form/SV_9ZcWbqMxH3MNhsN

The first 50 respondents to complete the survey will also be offered a $25 Amazon gift card for their participation.

Please see the survey itself for full eligibility criteria and informed consent. The survey ends at 5:00pm PDT on Friday, June 15, 2018.

Please do share on your social media networks! It would be very much appreciated.

Thanks!

Analytics, Altmetrics, and Reuse Twitter Chat — April 25

Members of the Digital Library Federation Assessment Interest Group’s (DLF AIG) Working Group on Reuse and the Library Information Technology Association (LITA) Altmetrics and Digital Analytics Interest Group are excited to announce the upcoming “Analytics, Altmetrics, and Reuse Twitter Chat” on Wednesday, April 25, from 2:00 p.m. EST to approximately 3:00 p.m. EST. The chat will engage participants in discussing the numerous approaches and barriers to reuse analysis in digital libraries.

Below you will find important information about the Twitter Chat, background content for participating in the conversation, and the questions to be explored during the chat.

Twitter Chat Planning Document: https://bit.ly/2Hfe0Ry

Hashtags: #dlfaig, #digreuse

Context/additional information:

Definitions:

Reuse: The DLF AIG Reuse Working Group defines reuse as how often and in what ways digital library materials are utilized and repurposed. In this definition, we do know the context of the use. This definition is fluid and open to change.

Examples of reuse: turning images into GIFs or memes; inclusion of digital collection materials in an external dataset (e.g., HTRC datasets or curated Internet Archive user collections); mashups of two or more songs or videos; data visualizations; other transformative applications of the collections external to digital collections systems

Examples of reuse assessment: reverse image lookup information; citation metrics of data and/or digital collection materials
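As background for the chat, here is a minimal sketch, in Python, of what tallying reuse-assessment signals per digital object might look like. All identifiers, signal names, and counts below are invented for illustration; they are not drawn from any real collection.

```python
from collections import Counter, defaultdict

def tally_reuse_signals(events):
    """Tally reuse signals (e.g., reverse-image-lookup hits, dataset
    citations) per digital object identifier."""
    totals = defaultdict(Counter)
    for object_id, signal in events:
        totals[object_id][signal] += 1
    return totals

# Hypothetical event log: (object identifier, type of reuse signal).
events = [
    ("ark:/12345/img001", "reverse_image_hit"),
    ("ark:/12345/img001", "dataset_citation"),
    ("ark:/12345/map042", "reverse_image_hit"),
    ("ark:/12345/img001", "reverse_image_hit"),
]

summary = tally_reuse_signals(events)
print(summary["ark:/12345/img001"]["reverse_image_hit"])  # 2
```

A per-object tally like this captures how often reuse signals occur; the harder assessment questions discussed in the chat (in what ways, and in what context, objects are reused) need richer data than a simple count.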

Questions:

Q1: How would you define #altmetrics and digital #analytics? #dlfaig #digreuse

Q2: Do you agree or disagree with the reuse definition: https://bit.ly/2Hfe0Ry? #dlfaig #digreuse

Q3. Is reuse measured at your institution? If so, how? #dlfaig #digreuse #datareuse

Q4: What reuse assessment data would be helpful to collect? Some potential examples might include campaign URLs or identifiers, patron requests/surveys, etc. #dlfaig #digreuse

Q5. What are common tools used to collect and analyze reuse assessment data? #dlfaig #digreuse

Q6: What do you do with this assessment data? How do you communicate the results of your analysis? #dlfaig #digreuse

Q7: What assessment standards are used? What limitations exist with those standards? #dlfaig #digreuse

Q8: What topics would you like to see in a future conversation around reuse assessment data collection and analysis? #dlfaig #digreuse

Behind the Scenes: What it takes to make us stage-ready

By Caroline Muglia

When this ragtag group of librarians and archivists embarked on a project that would evolve into an IMLS grant-funded endeavor, we knew at least one thing: we’d be learning a lot throughout this year-long process. Early on, we embraced an iterative approach to our conversations and brainstorming sessions, as well as to the deliverables and outputs we set for the grant.

This post will share a few of the areas where we acted, assessed, pivoted, iterated, and tried something new! We’re constantly finding ways to improve our work and learn from past practice. We hope this behind-the-scenes insight can be helpful to readers pursuing similar goals in a collaborative environment, or highlight areas worth interrogating further in your ongoing work.

Get grounded in Grounded Theory

When we first began this grant, we knew we’d generate a lot of data. What we didn’t anticipate was the breadth of the backend process necessary to do something valuable with the data. We embarked on qualitative coding projects to organize what our experts told us and to take actionable steps that would drive our reuse toolkit criteria. Since none of the six of us came aboard this project with secret coding experience, we had to learn together.

After an environmental scan and a few demos, we decided to use the Dedoose qualitative coding software for the notes we generated during the in-person and virtual focus group sessions (our grant specifies that any recording or transcription be permanently deleted within 48 hours). As a group, we discussed the advantages of, and scrutinized the challenges to, different qualitative models until we agreed to use Grounded Theory. In the coming month, we’ll complete the initial and final coding. We referenced our sources on Grounded Theory and regularly iterated on and improved our output in service of our grant deliverables.[1] In this instance, we started with an idea that our data would be valuable and ended with knowledge of new software, conversational literacy in grounded theory, and a darn good qualitative analysis.
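Dedoose handled our actual coding, but as a toy illustration of what initial coding produces, the sketch below tallies how often each code was applied across excerpts; every code and quote here is invented, not taken from our data.

```python
from collections import Counter

# Toy illustration of qualitative coding output: each focus-group excerpt
# carries the initial codes applied to it. All codes and quotes are invented.
coded_excerpts = [
    {"excerpt": "We can't tell who reuses our images off-site.",
     "codes": ["attribution_gap", "analytics_limits"]},
    {"excerpt": "Web analytics miss downstream reuse.",
     "codes": ["analytics_limits"]},
    {"excerpt": "Our maps turn up in memes constantly.",
     "codes": ["social_media_reuse"]},
]

# Tally how often each initial code was applied: a first step toward
# grouping codes into broader categories during final coding.
code_counts = Counter(code for e in coded_excerpts for code in e["codes"])
print(code_counts.most_common(1))  # [('analytics_limits', 2)]
```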

Sending surveys far and wide

In addition to the tailored focus groups, we also developed a survey instrument to capture experiences with and perspectives on reuse in a digital library environment. After distributing it via email lists, we gathered 302 responses. Our first thought was, How great, this small group of people churned out so much interest in this topic! Our second thought was, Yikes, how do we make sense of it all? Again, we iterated: we gathered the data using Qualtrics and assigned a few folks to dig through the results and propose a plan. We poked holes in the plan, which strengthened the goals and methodology we ultimately applied to the analysis of the insightful responses.

Befriend people smarter than you

We surrounded ourselves with a lot of smart people! This included the development of an engaged advisory board, and a focus on professionally active participants for both the in-person and virtual focus groups. The focus group participants discussed concepts of “use” and “reuse” of digital objects in their cultural heritage institutions, and the advisory board has helped us shape, and think through, these community conversations. To date, we have hosted one series of in-person focus groups and one series of virtual focus groups (with another round of each to come). We used a facilitation guide in the first round of in-person meetings, which allowed the different groups to stick to an agreed-upon script and promoted discussion around the same topics.

After the first meeting, we assessed the discussions and made some changes to the facilitation guide for the virtual focus group meeting. We cut down the conversation on “use” and ramped up the one on “reuse” since that’s the focus of our grant. We also shortened the virtual focus group with the expectation of wavering attention spans for a phone conversation. In our next round of focus group meetings, we’re also making some adjustments that we hope will benefit the development of the toolkit criteria.

Indeed, in some areas of our organization and collaboration, we have strengthened what we’ve been doing well all along!

Meet regularly and do stuff

Since the group is scattered across the USA, it is imperative for us to stick to a regular meeting schedule and to maximize our time in those meetings. Our ever-organized PI, Santi Thompson, distributes an agenda before each meeting, including any links or materials needed to have an informed conversation. By the end of the meeting, we aim to have clear action items, deadlines, and benchmarks, and try to distribute the work evenly among the six of us.

Sub-groups are great

Within our group, we use sub-groups (of 1–3 people) to tackle medium-term issues or examine larger concepts related to the project. This allows us to keep moving forward at an aggressive pace, while also learning as much as possible from our collaborators.

Phone a friend

When we need advice, guidance, or support, we ask for help from DLF or our Advisory Board. We didn’t get involved in this grant because we know everything! We got involved because we wanted to learn from each other and from experts already working in the field.

Know your capacity

This sounds a bit self-help, but what allows us to succeed is knowing our own capacity. With a large group, so many deliverables, and a lot of unknowns, we’ve succeeded in taking on more work when we have the capacity, and voicing our need to take on a supporting role when our capacity or attention shifts. Since we all have day jobs, personal lives, and other responsibilities, it’s up to us to say, Nope, I can’t present at that conference because I have a work deadline, or I enjoy writing blog posts, I can take on this assignment! This also gives us more confidence that we aren’t burdening each other with projects and deadlines.

Laugh

Like, belly laugh. Guffaw, even. It helps. It always helps.

[1] Charmaz, K. (1983). The grounded theory method: An explication and interpretation. In R. M. Emerson (Ed.), Contemporary Field Research: A Collection of Readings (pp. 109–126); Holton, J. A. (2007). The coding process and its challenges. In The SAGE Handbook of Grounded Theory (Part III, pp. 265–289).

Upcoming Activities: February 2018

Happy New Year!

The project team is hitting the ground running in 2018 as we prepare for several upcoming data collection sessions and conference presentations.

Metadata Interest Group – ALA Midwinter Meeting 2018: Ayla Stein will be presenting on behalf of the project team at ALA Midwinter 2018. The presentation is included as part of the Metadata Interest Group’s meeting programming at 8:30 AM on Sunday, February 11, 2018. Abstract to follow.

Presentation @ Code4Lib 2018: Liz Woolcott will be presenting for the project team at Code4Lib, taking place this year in Washington, D.C. The presentation is scheduled for Thursday, February 15, 2018 at 1:10 PM. Please see the session description for the complete presentation abstract.

Focus groups @ Code4Lib 2018: Also at Code4Lib, we will hold two in-person focus groups immediately after the close of the conference. These are our second set of in-person focus groups and a primary method of data collection.

That’s it for now. If you’re attending either ALA Midwinter or Code4Lib, we hope to see you there!

Digital Libraries or Digital Objects: Measuring the Impact and Value of Use and Reuse

This invited blog post was contributed by Ali Shiri, Professor at the School of Library and Information Studies, University of Alberta. 

The emergence of large-scale digital libraries and repositories such as HathiTrust, the Internet Archive, the Open Library, the Digital Public Library of America, Europeana, and the World Digital Library provides new opportunities for digital information users to openly and freely interact with a broad range of digital objects. The unprecedented availability of massive digital collections of books, manuscripts, images, photos, and maps offers new, individual and collective ways of making sense of information and of creating digital content.

The implications of using, reusing and repurposing digital content and collections, in this open and information-rich environment, are profound and multifaceted, cutting across many different institutional contexts such as libraries, archives and museums, as well as numerous disciplines and a wide range of user communities and audience types. Access to digital libraries, in particular, has been closely associated with and discussed in regard to measures of impact, value and usefulness. In fact, a cursory glance at the many digital library evaluation models developed in the past two decades demonstrates the importance of impact assessment and usability of digital libraries. In today’s world, digital information users, academics as well as the general public, are able to make use of digital libraries to read, explore, entertain, write, research, create and contextualize. Users engage in a diverse range of information practices and tasks, including searching, retrieving, using, learning, conceptualizing, synthesizing, presenting and disseminating. The use and reuse of digital objects is at the centre of all these activities. Teachers use digital libraries to support learning and instruction. Researchers make use of digital information and digital research objects to support the production and dissemination of new knowledge. Digital humanists make use of digital objects both as learning and research artifacts.

In line with these developments, the area of digital public scholarship is emerging as many disciplines within the humanities and social sciences promote the creation and use of digital artifacts and objects, not only by researchers and scholars but also by the general public. Community archives and numerous digital humanities projects that generate digital data and artifacts open a new horizon for the general public to develop digital literacy and digital fluency skills. Thanks to the availability of a diverse range of tools for using and modifying digital objects, digital information users are now capable of creative, innovative and novel use of digital objects to support discovery and exploration.

Assessment of the value, impact and usability of digital objects requires a holistic and multidimensional framework that takes into account use, reuse and repurposing of digital objects. While user evaluation studies have contributed significantly to how we measure, assess and evaluate the value of digital libraries as a whole, we need to be able to demonstrate the value of digital libraries at the digital object level as well. There are quantitative measures associated with the use and reuse of digital objects, such as the number of clicks, downloads, bookmarks, views, likes, and items shared and saved. However, these measures only provide a general indication of impact and value. We need measures that document and demonstrate the quality, extent and nature of the use and reuse of digital objects in relation to such facets as the context, the discipline, the collection that the object belongs to, geographic origin, time period, and the nature of the task or the information-bearing object that contains the used or reused item. This calls for a holistic assessment framework that addresses various aspects and components of use and reuse, as well as techniques, tools and technologies that support digital object use and reuse. Development of such a framework should be evidence-based and empirically supported, and should draw on qualitative and quantitative data, use cases, and user evaluation studies that involve diverse user groups and target audiences. A typology of reuse cases may, for instance, include digital object reuse in the context of educational, cultural, linguistic, artistic, historical, chronological, geographical, and genealogical research and exploration. The value and impact of digital objects and their reuse should be conceptualized not only as part of the scholarly communication lifecycle but also as part of the lifelong learning and recreational experiences and activities of digital information users and searchers.
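To make the distinction concrete, here is a minimal sketch of rolling up raw quantitative measures along a single facet. All records, field names, and counts are hypothetical, not drawn from any particular digital library platform.

```python
from collections import defaultdict

# Hypothetical per-object interaction records; field names and counts are
# illustrative, not drawn from any particular digital library platform.
records = [
    {"object_id": "photo-17", "collection": "photographs",
     "metric": "downloads", "count": 40},
    {"object_id": "photo-17", "collection": "photographs",
     "metric": "views", "count": 310},
    {"object_id": "map-03", "collection": "maps",
     "metric": "downloads", "count": 12},
    {"object_id": "map-03", "collection": "maps",
     "metric": "views", "count": 95},
]

def rollup(records, facet):
    """Aggregate raw quantitative measures (views, downloads, and so on)
    along one facet, e.g. collection, discipline, or time period."""
    totals = defaultdict(lambda: defaultdict(int))
    for record in records:
        totals[record[facet]][record["metric"]] += record["count"]
    return totals

by_collection = rollup(records, "collection")
print(by_collection["photographs"]["downloads"])  # 40
```

Rollups like this answer "how much," but not "in what way or context"; a fuller framework would attach facets such as discipline, task, and time period to each record.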

It is timely for the digital library community to ensure and promote the relevance and usefulness of digital libraries by providing new frameworks and measures that evaluate and assess the impact of use and reuse of digital objects in relation to intellectual and artistic creativity as well as to informed citizenry, social responsibility and democracy.

Let the (early) data analysis begin!

The project team is fresh off its first successful presentation at the 2017 DLF Forum in Pittsburgh. Genya O’Gara formally introduced our team and project during a session on the DLF Assessment Interest Group’s yearly accomplishments. Following the forum, the project team convened our first in-person focus group. Eight individuals kindly shared their thoughts and perspectives on the complexities and challenges of understanding content reuse and the promising possibilities that reuse brings to digital cultural heritage assessment. On the heels of this first focus group came the completion of our survey. We were delighted to receive over 200 responses (and were equally excited to provide $25 Amazon gift cards to the first 50 respondents). As the team continues to plan future focus groups, we will also be combing through the data and, with guiding insights from our Advisory Board, formulating initial results. This work will ultimately lead to a series of use cases and functional requirements for a future assessment toolkit.

The results of all of this analysis will be shared widely with the community. The earliest reveal of our analysis will be discussed at several upcoming conferences, including ALA Midwinter in Denver. The team will update you all on other presentations as they are confirmed.

Call for Survey Participation

Our first step in our needs-gathering process is to release a survey that seeks to identify how cultural heritage organizations currently assess digital library reuse, to identify barriers to assessing reuse of digital objects, and to begin pulling together community priorities for potential solutions and next steps.

If you are a cultural heritage or research data professional interested in methods of evaluating reuse of your institution’s digital object collections, consider taking our survey: http://uhlibrary.qualtrics.com/jfe/form/SV_bKNLEtMwORvQmJD.

Please see the survey itself for full eligibility criteria and informed consent. The survey ends at 5:00pm EDT on Wednesday, October 11, 2017.

Please do share on your social media networks! It would be very much appreciated.

Thanks!

We’re growing!: Welcoming 6 Advisory Board Members

We are delighted to announce the formation of our new Advisory Board, composed of expert stakeholders from cultural heritage organizations throughout the United States and Canada. The Board will provide critical feedback on the methods used for data collection and analysis, and guidance in drafting final recommendations. The Board will also provide direction in identifying digital libraries and other institutions that could benefit from a “reuse toolkit.” (The functional requirements and use cases for a toolkit are the primary outputs of this grant.)

The project team identified Advisory Board members based on their experience in building and administering digital collections, their experience researching and teaching on digital library issues, and their combination of a granular understanding of the functions of digital libraries with the broad vision needed to shape this toolkit.

Advisory board members will serve for one year. During that year, the project team will host three virtual advisory board meetings, with the remainder of the feedback gathered through email and conference calls.

You can read more about the interests and professional accomplishments of each board member on the Advisory Board page.

Please join us in welcoming the Advisory Board!


Launching the project

Welcome to the IMLS-funded grant project Developing a Framework for Measuring Reuse of Digital Objects! You can find out more about the project aims on the About page. Over the next year, readers can expect to see a series of posts to the Measuring Reuse website that follow the team’s progress. Along the way, we hope to include reflections and perspectives on reuse from digital library practitioners. Readers are encouraged to watch this space to learn more about the work and to explore the boundaries of reuse with the team.

The grant aims to gather input from those working in digital libraries by conducting a formal needs assessment. This work will inform the development of a “reuse toolkit” to promote the adoption of standardized reuse measurement in digital libraries. The toolkit will highlight sustainable assessment techniques and best practices for communicating the impacts of digital collections. The grant outputs are intended to support cultural heritage institutions in developing collections through a nuanced understanding of how digital library materials are being used.  

For our first post, we wanted to share our early efforts. The grant project began on July 1, 2017. Since then, we have assembled an advisory board (more soon), navigated the complex world of institutional review board (IRB) submission and approval, and compiled a survey that the team will use to develop a baseline for understanding how cultural heritage organizations view and incorporate reuse into their assessment practices. The team expects to solicit survey responses in the coming weeks using a variety of listservs. Targeted lists will be those that intersect with the topics of digital libraries/repositories and cultural heritage organizations and their communities’ interests. The team will offer $25 Amazon gift cards to the first 50 individuals who complete the survey. If you are interested in participating in the survey, please watch for the invitation, or feel free to contact any member of the project team for more information.

With our first month completed, the team is looking forward to planning and executing other elements of the grant – including focus groups and conference presentations – over the next year. Check back to learn more about these efforts in future posts.

Thanks!
