Point-of-Use Surveys

Definition

Point-of-use surveys are a type of intercept survey. Intercept surveys are a research method used when the members of a population are unknown (Toepoel, 2016). When used in an online environment, intercept surveys take the form of either embedded surveys or pop-up surveys.

When used to assess digital object use/reuse, a point-of-use survey is an online survey triggered by specific user actions on a website, such as loading a webpage or interacting with an element on a webpage (e.g., downloading a digital object). 

Applications for assessing digital content use/reuse

Point-of-use surveys pose a brief series of questions to users who are undertaking specific actions on a website. The goal is to learn more about the intended purpose for which users are undertaking those actions, and/or to gather demographic information about those users. Such surveys are typically optional and include only a few questions. One of the most useful applications of this method for assessing digital object use/reuse is to trigger a point-of-use survey when a digital object is downloaded (as sketched below), though not all tools allow such precise triggers. Other applications might include triggering a survey when a digital object is viewed or when users open web pages from which they are likely to download digital objects.
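
As an illustration of how such a trigger might work, the following sketch listens for clicks on download links and launches the survey at that moment. It is a minimal sketch, not any platform’s actual integration: the data attribute, the function name, and the logging are assumptions, and most survey platforms provide their own trigger configuration instead.

```typescript
// Minimal sketch of a client-side point-of-use trigger, assuming download
// links are marked with a hypothetical data-download attribute.
function showPointOfUseSurvey(objectUrl: string): void {
  // A real implementation would open the survey platform's pop-over here;
  // logging keeps the sketch self-contained.
  console.log(`Point-of-use survey triggered for download: ${objectUrl}`);
}

document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement).closest<HTMLAnchorElement>(
    "a[data-download='digital-object']" // hypothetical marker for download links
  );
  if (link) {
    showPointOfUseSurvey(link.href);
  }
});
```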

A point-of-use survey may be offered as an embedded survey or a pop-up survey. An embedded survey is located in a specific place on a website where users must actively seek it out. Often, those who feel compelled to respond are not a representative sample of visitors; they tend to have strong opinions they want to share.

A pop-up survey, or a pop-up survey invitation, is triggered by predefined actions, appearing on-page and only to users who meet defined criteria. While this approach may produce a higher number of responses, as well as responses more representative of the targeted population, pop-ups are considered more intrusive than embedded surveys. For this reason, cookies are typically used to ensure that a user sees the survey only once: if they have already answered, declined, or closed the survey, cookies will stop the pop-up from reappearing. Pop-ups can be run for limited periods of time, and, depending on the tool used, can also be set to appear to only a certain percentage of users (see the sketch below).
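
The gating behavior just described can be expressed in a few lines of client-side code. The sketch below is illustrative only: the cookie name and the 10% sampling rate are assumptions, and most survey platforms expose equivalent settings without custom code.

```typescript
// Sketch of pop-up gating: show the survey at most once per browser and
// only to a sampled fraction of visitors (assumed 10% here).
const SURVEY_COOKIE = "pou_survey_seen"; // hypothetical cookie name
const SAMPLE_RATE = 0.1; // assumed sampling percentage

function shouldShowSurvey(): boolean {
  const alreadySeen = document.cookie
    .split("; ")
    .some((entry) => entry.startsWith(`${SURVEY_COOKIE}=`));
  return !alreadySeen && Math.random() < SAMPLE_RATE;
}

function markSurveySeen(): void {
  // Persist for one year so that answering, declining, or closing the
  // survey keeps the pop-up from reappearing.
  document.cookie = `${SURVEY_COOKIE}=1; max-age=${60 * 60 * 24 * 365}; path=/`;
}
```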

Most high-quality survey platforms are able to display pop-ups in attractive modal windows that float above the webpage and do not require visitors to load a new page; because no new browser window is opened, they remain unaffected by pop-up blockers. These are sometimes interchangeably referred to as “pop-overs.”
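
Because a pop-over is ordinary markup added to the current page rather than a new browser window, pop-up blockers have nothing to intercept. The sketch below, with hypothetical styling and wording, shows the basic pattern of an easily dismissible invitation.

```typescript
// Minimal pop-over sketch: a dismissible panel appended to the current page.
// No new window is opened, so pop-up blockers are not involved.
function openSurveyPopover(surveyUrl: string): void {
  const panel = document.createElement("div");
  panel.style.cssText =
    "position:fixed; bottom:1rem; right:1rem; padding:1rem; " +
    "background:#fff; border:1px solid #ccc; z-index:1000;";

  const invitation = document.createElement("a");
  invitation.href = surveyUrl;
  invitation.textContent = "Tell us how you plan to use this item"; // hypothetical wording

  const dismiss = document.createElement("button");
  dismiss.textContent = "No thanks";
  dismiss.addEventListener("click", () => panel.remove()); // easy dismissal

  panel.append(invitation, dismiss);
  document.body.appendChild(panel);
}
```

Combined with the gating sketch above, a page would call openSurveyPopover only when shouldShowSurvey() returns true, and then record the impression with markSurveySeen().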

Tools

Many popular survey platforms provide pop-up or pop-over survey capabilities, and such surveys can also be custom-built by web developers. While there are dozens of vendor-hosted survey platforms, overviews of how to create pop-ups with three popular platforms can be found in this toolkit.

Supplemental materials

The D-CRAFT project team has created this sample point-of-use survey for digital collections to assist practitioners in applying this method.

Ethical guidelines

Practitioners should follow the practices laid out in the “Ethical considerations and guidelines for the assessment of use and reuse of digital content.” The Guidelines are meant both to inform practitioners in their decision-making, and to model for users what they can expect from those who steward digital collections.

Additional guidelines for responsible practice

Before requesting personally identifying information from respondents in a pop-up survey, consider whether the information is necessary. If it is, request only as much information as is needed. If possible, separate identifying information from survey responses after data collection and follow recommended practices for storing and protecting personal data. If Internet Protocol (IP) addresses are automatically collected and stored by the survey software, discard that data; IP addresses are widely considered to be personally identifying information. Explain your privacy practices to participants up front, including whether the survey is anonymous and how personal data and other survey data will be protected, shared, retained, and managed.
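
If the survey software’s export does include IP addresses, they can be stripped before the data is stored or shared. The field names in this sketch are assumptions about a generic export, not any particular platform’s schema.

```typescript
// Sketch of discarding personally identifying fields from exported survey
// responses; "ipAddress" and "email" are hypothetical field names.
interface RawResponse {
  ipAddress?: string;
  email?: string;
  answers: Record<string, string>; // question id -> response text
}

function scrubResponses(rows: RawResponse[]): Array<Record<string, string>> {
  // Keep only the survey answers; identifying fields are simply not copied.
  return rows.map((row) => ({ ...row.answers }));
}
```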

If you work in a college, university, or other institution of higher education and intend to share the results of your survey beyond solely internal review and use, then you will likely need to contact your campus research board for human subjects research, frequently called an Institutional Review Board, or IRB for short. The IRB is a critical resource and enforcing authority for ensuring that researchers follow ethical guidelines and protocols for human subjects research. The IRB will review your survey plan and authorize your survey before you begin, and it also provides detailed guidelines for responsible practice, such as the aforementioned practices for gathering minimal, if any, identifying information.

Strengths

  • Point-of-use surveys are one of the few ways to gather information in the moment from and about users of digital objects, such as user demographics or what users plan to do with the digital objects. Because users of digital content are typically anonymous, a point-of-use survey is virtually the only way to reach every user who downloads a digital object from a website. Users can remain anonymous while still providing information that helps practitioners understand use/reuse of digital objects.

  • Point-of-use surveys are easier to implement and less time-intensive than many other methods for assessing use and reuse of digital objects. While initial survey setup takes a small amount of time, survey distribution is automated, and the method requires no further time until the data analysis phase of the assessment. Because the majority of data collected will likely be quantitative (e.g., multiple-choice rather than free-text questions), the time needed to analyze results is also small compared with time-intensive qualitative methods such as focus groups or interviews, or with methods that cannot be easily automated, such as reverse image lookup.

  • Virtually all other data collection methods for digital content use/reuse attempt to pinpoint reuse after the fact. Point-of-use surveys intercept users synchronously as they access digital content.

Weaknesses

  • Because a point-of-use survey gathers data when an object is first viewed or downloaded, data collected reflects only intended use, not actual use. These surveys gather subjective, user-reported data about how someone plans to use/reuse digital objects before that use has occurred and therefore may not accurately reflect eventual use/reuse. If a point-of-use survey is being used to gather demographic data instead of plans for use/reuse, this particular concern is not an issue.

  • Pop-up and pop-over surveys can be invasive. When implementing a pop-up survey, practitioners should follow best practices that limit invasiveness, such as restricting the survey to appear only once per IP address, allowing users to easily dismiss the survey without filling it out, or running the survey only for short periods of time.

  • Respondents are self-selecting. It is difficult to know whether responses are a truly representative sample, or whether a particular user demographic (or users with certain intended uses) is more likely to respond than others.

Highlights from projects and scholarship

  • Duke University Libraries’ Digital Collections point-of-use survey
    The survey was part of a larger effort in 2015 to understand scholarly use data to inform digitization decisions, and to better understand if and how the organization was meeting the needs of researcher communities using its digital content. This data collection effort struck a compromise between a pop-up and an embedded survey by adding a small but highly visible survey invitation to the sidebar of the digital collections webpage for six months. While the team received far fewer responses than they likely would have with a pop-up survey, the invitation was minimally intrusive and more visible than an embedded survey on a single static webpage. This method did not allow them to link the survey invitation to visitors who downloaded digital objects. Quantitative questions covered type of researcher, purpose of the visit, and “what will you do with the images and/or resources you find on this site?” The survey also provided an optional free-text response and the option to add personally identifying information if a respondent wanted to be contacted to provide further feedback.

    Bragg, M. (2015, July 9). Who, Why, and What: the three Ws of the Duke Digital Collections Mini Survey. Bitstreams (blog), Duke University Libraries.

    Bragg, M. (2016, March 26). Survey Says: The Who, Why, What Answers you have been Waiting for! Bitstreams (blog), Duke University Libraries.

  • The University of Tennessee’s LibValue project 
    The goal of this study was not to assess use/reuse of digital objects specifically, but to assess more generally the value of digital special collections in support of the University’s goals and to better understand how people make use of digital collections. The study used a short pop-up survey, triggered by arrival at the digital collections landing page, to gather information about users and to request volunteers who subsequently participated in phone interviews. The survey gathered quantitative information such as type of researcher and purpose for using the website, asked qualitative free-text questions such as “how has access to this digital collection affected your research?”, and gave respondents the option to add personally identifying information if they were willing to be contacted for a more in-depth phone interview.

    Association of Research Libraries, “LibValue — Digitized Special Collections,” accessed June 13, 2022.

    Association of Research Libraries, “Pop-up survey question tree,” accessed June 13, 2022, via the Wayback Machine.

  • Measuring the Impact of Networked Electronic Resources: MINES for Libraries
    The MINES for Libraries project was developed and managed by the Association of Research Libraries (ARL), beginning in 2003 and ending in approximately 2006. The product was a 3-5 question point-of-use survey that gathered data from a sample of users of an institution’s electronic resources, such as digital collections, open access journals, and institutional repositories. The participating library was responsible for implementing the technical infrastructure required for the survey. Libraries running EZproxy could locally implement an application that presented the MINES survey to networked users as they initiated an EZproxy session and captured their usage during the session, associating it with their survey responses. Over 50 North American libraries participated in the project. The survey asked users the purpose of their use and relied on EZproxy authentication to determine a user’s location, type of user, and departmental affiliation. The project did not focus on special collections specifically, which are typically open to the public and not limited in use to users who authenticate their identity.

    Association of Research Libraries, “MINES for Libraries,” accessed June 7, 2020.

    Association of Research Libraries. 2005. “MINES for Libraries Final Report.” (see sample survey on page 12)

    Plum, T., Franklin, B., Kyrillidou, M., Roebuck, G., & Davis, M. (2010). Measuring the Impact of Networked Electronic Resources. Performance Measurement and Metrics, 11(2).

  • Survey of users of a museum website and digital collections:
    The authors surveyed visitors to the Tate website as they browsed the site and its digital collections, gathering responses about their intentions for using the website and digital collections. They then analyzed why and how users interacted with the digital content on the website.

    Villaespesa, E. & Stack, J. (2015, January 30). Finding the motivation behind a click: Definition and implementation of a website audience segmentation. MW2015: Museums and the Web 2015.

Additional resources

Fielding, N. G., Lee, R. M., & Blank, G. (Eds.). (2017). The SAGE Handbook of Online Research Methods. London: SAGE Publications.

Toepoel, V. (2016). Introduction to online surveys. In Doing Surveys Online, pp. 1-18. London: SAGE Publications Ltd. 

Contributors

Contributors to this page include Joyce Chapman and Harriett Green.

Cite this page

Chapman, J., Green, H. (2023). Point-of-Use Surveys. Digital Content Reuse Assessment Framework Toolkit (D-CRAFT); Council on Library & Information Resources. https://reuse.diglib.org/toolkit/point-of-use-surveys/
