Surveys

Definition

Surveys are structured questionnaires that can be completed on paper or online, typically through a form created by specialized software. 

In the case of the assessment of digital object use/reuse, a survey would typically be distributed in an online format, either to known users (if an institution has access to a list of email addresses for a subset of its special collections or digital collections users) or to unknown users (such as a point-of-use survey).

Applications for assessing digital content use/reuse

Surveys are an excellent method to gather large quantities of quantitative data, as well as qualitative data in the form of free-text comments, with minimal time and effort required from staff. Because cultural heritage institutions typically allow free anonymous access to digital materials, one of the advantages of surveys over other methods is the ability to collect data from these unknown users. In the case of assessment of digital object use/reuse, surveys may ask respondents questions such as whether they have used/reused digital objects and why or why not; about their habits, intentions, and goals when using and reusing digital objects; about the impact that use may have; or demographic questions about themselves.

Tools

There are numerous platforms available for the creation of online surveys, including commonly used and accessible tools such as Google Forms, SurveyMonkey, and Qualtrics.

Supplemental materials

The D-CRAFT project team has compiled these samples of survey privacy language to assist practitioners in conducting this method.

Ethical guidelines

Practitioners should follow the practices laid out in the “Ethical considerations and guidelines for the assessment of use and reuse of digital content.” The Guidelines are meant both to inform practitioners in their decision-making and to model for users what they can expect from those who steward digital collections.

Additional guidelines for responsible practice

Gather as little identifying information as possible. Where identifying information must be collected, separate it from survey responses after data collection and follow recommended practices for storing and protecting personal data. If the survey software automatically collects and stores Internet Protocol (IP) addresses, discard them; IP addresses are widely considered to be personally identifying information. Explain your privacy practices to participants up front, including whether the survey is anonymous and how personal data and other survey data will be protected, shared, retained, and managed.
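Where the survey platform allows a raw export of responses, much of this cleanup can be scripted. The sketch below, written in Python with the pandas library, is a minimal illustration only: the file and column names (survey_export.csv, response_id, email, ip_address) are hypothetical and will vary by platform.

    import pandas as pd

    # Load the raw survey export (hypothetical file and column names).
    responses = pd.read_csv("survey_export.csv")

    # Discard automatically collected IP addresses, which are widely
    # considered to be personally identifying information.
    responses = responses.drop(columns=["ip_address"], errors="ignore")

    # Separate optional contact information (e.g., email addresses offered
    # for follow-up) from the substantive responses, keyed by response ID,
    # and store it separately under restricted access.
    contacts = responses[["response_id", "email"]].dropna(subset=["email"])
    contacts.to_csv("contacts_restricted_access.csv", index=False)

    # Store the de-identified responses for analysis.
    responses.drop(columns=["email"]).to_csv(
        "responses_deidentified.csv", index=False
    )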

If you work in a college, university, or other institution of higher education and intend to share the results of your survey beyond internal review and use, you will likely need to contact your campus research board for human subjects research, frequently called an “Institutional Review Board,” or IRB for short. The IRB is a critical resource and the enforcing authority for ensuring that researchers follow ethical guidelines and protocols for human subjects research: it will review and authorize your survey plan, and it provides detailed guidelines for responsible practice, such as the aforementioned practices for gathering minimal, if any, identifying information. Note, however, that this IRB process applies only if you work in higher education.

Strengths

  • Surveys provide a way to collect feedback from a large number of users quickly.

  • Surveys can be deployed in such a way that data can be gathered from unknown users (i.e., when a survey is embedded in or advertised on a cultural heritage institution’s web page, or deployed as a pop-up or pop-over invitation, as in a point-of-use survey). This makes surveys one of the few methods that can gather information from unidentified users of digital objects.

  • Surveys are easier to implement and less time-intensive than many other methods of assessing use and reuse of digital objects. While initial survey setup takes a small amount of time, survey distribution is automated, and the method will not require more time until the data analysis phase of the assessment. If the survey is structured so that the majority of data collected is quantitative (e.g., multiple-choice questions instead of free-text questions), the time needed to analyze results is also less than that required for time-intensive qualitative methods such as focus groups or interviews, or for methods that cannot be easily automated (a brief tabulation sketch follows this list).
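As a simple illustration of how quickly closed-ended responses can be tabulated, the following Python sketch counts the answers to one hypothetical multiple-choice question (here called reuse_purpose) in a de-identified CSV export; adjust the file and column names to match your own survey tool's output.

    from collections import Counter
    import csv

    # Collect the non-empty answers to one multiple-choice question.
    with open("responses_deidentified.csv", newline="") as f:
        answers = [row["reuse_purpose"]
                   for row in csv.DictReader(f) if row["reuse_purpose"]]

    # Print each answer choice with its count and share of responses.
    for choice, count in Counter(answers).most_common():
        print(f"{choice}: {count} ({count / len(answers):.0%})")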

Weaknesses

  • Depending on how respondents are recruited, they may be self-selecting. In such instances, it is difficult to know whether responses form a truly representative sample or whether a particular user demographic is more likely to respond than others.

  • Unlike methods such as focus groups and interviews, anonymous surveys offer no ability to ask follow-up questions, unless contact information is requested and provided as part of the survey.

  • Surveys are not the best venue for large numbers of qualitative (free-text) questions: if you find your survey consists of many free-text questions, consider conducting interviews with a representative sample of users instead.

  • Asking users to report on their own behavior can provide less accurate data than methods that track users’ actual behavior (e.g., web analytics, citation analysis). Users often misremember their own behavior and motivations, or may choose not to report them accurately.

Learn how practitioners have used this method

  • Survey to understand historians’ use of primary source materials in research
    This study examines how academic historians search for, access, and use primary source materials in research. Recruited historians completed an online survey about current information practices and potential information needs in archival settings. Results show the methods historians use to search for primary source materials; the types of primary source documents they are most likely to use; whether they access materials online or in person; their use of digitized archival collections; factors they consider important in their decision to use archival collections; and what might prevent them from using collections. The survey instrument is included in the appendices.

    Chassanoff, A. (2013). Historians and the Use of Primary Source Materials in the Digital Age. The American Archivist, 76(2), 458–480.

  • Survey and semistructured interviews with arts faculty about use of digital collections
    The authors surveyed and interviewed humanities faculty from twelve research universities about their research practices with digital collections. The responses to the survey informed the development of the interview protocol, which asked how faculty use materials from digital collections (including reuse) in their research and scholarly practices. Interviews were conducted by phone and by email. The interview protocol is included in an appendix.

    Green, H. E., & Courtney, A. (2015). Beyond the Scanned Image: A Needs Assessment of Scholarly Users of Digital Collections. College & Research Libraries, 76(5), 690–707.

  • Survey of users of a museum website
    The authors surveyed users visiting the website of a museum. Though conducted in the early days of website research, this study provides a pioneering example of a cultural institution surveying users of its website and digital content.

    Chadwick, J., & Boverie, P. (1999). A Survey of Characteristics and Patterns of Behavior in Visitors to a Museum Web Site. Museums and the Web.

Additional resources

Aery, S. (2015, June 26). The Elastic Ruler: Measuring Scholarly Use of Digital Collections. Bitstreams: The Digital Collections Blog.

Beaudoin, J. E., & Evans Brady, J. (2011). Finding Visual Information: A Study of Image Resources Used by Archaeologists, Architects, Art Historians, and Artists. Art Documentation: Journal of the Art Libraries Society of North America, 30(2), 24–36.

Beaudoin, J. E. (2014). A Framework of Image Use among Archaeologists, Architects, Art Historians and Artists. Journal of Documentation, 70(1), 119–147.

Fielding, N. G., Lee, R. M., & Blank, G. (Eds.). (2017). The SAGE Handbook of Online Research Methods. London: SAGE Publications.

Franklin, B., & Plum, T. (2008). Assessing the Value and Impact of Digital Content. Journal of Library Administration, 48(1), 41–57.

Harris, V., & Hepburn, P. (2013). Trends in Image Use by Historians and the Implications for Librarians and Archivists. College & Research Libraries, 74(3), 272–287.

Hughes, L. M., Ell, P. S., Knight, G. A. G., & Dobreva, M. (2015). Assessing and Measuring Impact of a Digital Collection in the Humanities: An Analysis of the SPHERE (Stormont Parliamentary Hansards: Embedded in Research and Education) Project. Literary and Linguistic Computing, 30(2), 183–198.

Kelly, B. (2012, December). Evidence, Impact, Metrics: Final Report.

Kelly, E. J. (2014). Assessment of Digitized Library and Archives Materials: A Literature Review. Journal of Web Librarianship, 8(4), 384–403.

Kelly, E. J., Muglia, C., O’Gara, G., Stein Kenfield, A. S., Thompson, S., & Woolcott, L. (2018). Measuring Reuse of Digital Objects: Preliminary Findings from the IMLS-Funded Project.

Kenfield, A. S., Kelly, E. J., Muglia, C., O’Gara, G., Thompson, S., & Woolcott, L. (2019). Measuring Reuse of Institutionally-Hosted Grey Literature. Grey Journal, 15(1), 51–58.

Marsh, D. E., Punzalan, R. L., Leopold, R., Butler, B., & Petrozzi, M. (2016). Stories of Impact: The Role of Narrative in Understanding the Value and Impact of Digital Collections. Archival Science, 16(4), 327–372.

Marshall, C. C., & Shipman, F. M. (2011). The Ownership and Reuse of Visual Media. In Proceedings of the 11th Annual International ACM/IEEE Joint Conference on Digital Libraries (JCDL ’11), 157. Ottawa, Ontario, Canada: ACM Press.

Meyer, E. T. (2010). Splashes and Ripples: Synthesizing the Evidence on the Impacts of Digital Resources. Joint Information Systems Committee (JISC) Report. London: JISC.

Punzalan, R. L., Marsh, D. E., & Cools, K. (2017). Beyond Clicks, Likes, and Downloads: Identifying Meaningful Impacts for Digitized Ethnographic Archives. Archivaria, 84, 61–102.

Rieger, O. Y. (2009). Search Engine Use Behavior of Students and Faculty: User Perceptions and Implications for Future Research. First Monday, 14(12).

Sinn, D., & Soares, N. (2014). Historians’ Use of Digital Archival Collections: The Web, Historical Scholarship, and Archival Research. Journal of the Association for Information Science and Technology, 65(9), 1794–1809.

Summerlin, D. (2014). Selecting Newspaper Titles for Digitization at the Digital Library of Georgia. D-Lib Magazine, 20(9/10).

“Surveys.” usability.gov, US Department of Health and Human Services. 2006. Accessed June 16, 2020.

Tanner, S. (2012). Measuring the Impact of Digital Resources: The Balanced Value Impact Model. London: King’s College London.

Wolf, C., Joye, D., Smith, T. W., & Fu, Y. (Eds.). (2016). The SAGE Handbook of Survey Methodology. London: SAGE Publications.

Contributors

Contributors to this page include Joyce Chapman and Harriett Green.

Cite this page

Chapman, J., & Green, H. (2023). Surveys. Digital Content Reuse Assessment Framework Toolkit (D-CRAFT); Council on Library & Information Resources. https://reuse.diglib.org/toolkit/surveys/
