Point-of-use surveys are a type of intercept survey, a research method used when the members of a population are unknown (Toepoel, 2016). In an online environment, intercept surveys take the form of either embedded surveys or pop-up surveys.
When used to assess digital object use/reuse, a point-of-use survey is an online survey triggered by specific user actions on a website, such as loading a webpage or interacting with an element on a webpage (e.g., downloading a digital object).
Point-of-use surveys pose a brief, optional series of questions to users undertaking specific actions on a website. The goal is to learn the purposes for which users undertake those actions and/or demographic information about those users. One of the most useful applications of this method for assessing digital object use/reuse is to trigger a survey when a digital object is downloaded, though not all tools allow such precise triggers. Other applications include triggering a survey when a digital object is viewed, or when users open web pages from which they are likely to download digital objects.
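As a concrete illustration, the download trigger described above might be sketched as follows. This is a hypothetical sketch, not any particular survey platform's API: the function name, file-extension list, and event wiring are all assumptions to be adapted to your own site.

```typescript
// Hypothetical helper: decide whether a clicked link counts as a
// digital-object download that should trigger the point-of-use survey.
// The extension list is an assumption; adjust it to your collection's formats.
const OBJECT_EXTENSIONS = [".pdf", ".tiff", ".jpg", ".png", ".mp3", ".mp4"];

export function isObjectDownload(href: string): boolean {
  // Strip any query string or fragment before checking the extension.
  const path = href.split(/[?#]/)[0].toLowerCase();
  return OBJECT_EXTENSIONS.some((ext) => path.endsWith(ext));
}

// In the page itself, one might wire this to click events (sketch only;
// showSurveyInvitation is a placeholder for your platform's pop-up call):
// document.addEventListener("click", (e) => {
//   const link = (e.target as HTMLElement).closest("a");
//   if (link && isObjectDownload(link.href)) showSurveyInvitation();
// });
```

Keeping the classification logic in a pure function, as here, makes it easy to test outside a browser and to reuse for both download and view triggers.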
A point-of-use survey may be offered as an embedded survey or a pop-up survey. An embedded survey sits in a fixed place on a website, where users must actively seek it out. Those who do are often not a representative sample of visitors; they typically have strong opinions they want to share.
A pop-up survey, or pop-up survey invitation, is triggered by predefined actions and appears on-page only to users who meet defined criteria. This tends to produce both more responses and responses more representative of the targeted population, but pop-ups are more intrusive than embedded surveys. For this reason, cookies are typically used to ensure that a user sees the survey only once: if they have already answered, declined, or closed it, the cookie stops the pop-up from reappearing. Pop-ups can be run for limited periods of time and, depending on the tool used, can be set to appear to only a certain percentage of users.
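The suppression and sampling behavior described above can be sketched in a few lines. This is an illustrative sketch only: the cookie name and function names are assumptions, the random source is injectable purely for testability, and a real implementation would also set the cookie (with an appropriate expiry) when the pop-up is shown.

```typescript
// Assumed cookie name; any platform-appropriate name works.
const SURVEY_COOKIE = "pou_survey_seen";

// Returns true if the cookie string (in document.cookie's
// "name=value; name=value" format) already records that this visitor
// answered, declined, or closed the survey.
export function hasSeenSurvey(cookieString: string): boolean {
  return cookieString
    .split(";")
    .some((c) => c.trim().startsWith(`${SURVEY_COOKIE}=`));
}

// Decide whether to show the pop-up: never re-show it to a visitor who
// has already seen it, and sample only a fraction of remaining visitors.
// sampleRate of 0.25 shows the invitation to roughly 25% of visitors.
export function shouldShowSurvey(
  cookieString: string,
  sampleRate: number,
  random: () => number = Math.random
): boolean {
  if (hasSeenSurvey(cookieString)) return false;
  return random() < sampleRate;
}
```

In the browser, this would be called with `document.cookie`, and the pop-up code would write `pou_survey_seen=1` once the invitation is answered, declined, or dismissed.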
Most high-quality survey platforms can display pop-ups in attractive modal windows that float above the webpage. Because these are rendered within the page rather than opened in a new browser window, they do not require visitors to load a new page and are unaffected by pop-up blockers. They are sometimes referred to interchangeably as “pop-overs.”
Many popular survey platforms provide pop-up or pop-over survey capabilities, and such surveys can also be custom-built by web developers. While there are dozens of vendor-hosted survey platforms, this toolkit includes overviews of how to create pop-ups with three popular platforms.
The D-CRAFT project team has created this sample point-of-use survey for digital collections to assist practitioners in conducting this method.
Practitioners should follow the practices laid out in the “Ethical considerations and guidelines for the assessment of use and reuse of digital content.” The Guidelines are meant both to inform practitioners in their decision-making, and to model for users what they can expect from those who steward digital collections.
Before requesting personally identifying information from respondents in a pop-up survey, consider whether the information is necessary. If it is, request only as much information as is needed. If possible, separate identifying information from survey responses after data collection and follow recommended practices for storing and protecting personal data. If Internet Protocol (IP) addresses are automatically collected and stored by the survey software, discard them; IP addresses are widely considered to be personally identifying information. Explain your privacy practices to participants upfront, including whether the survey is anonymous and how personal data and other survey data will be protected, shared, retained, and managed.
If you work in a college, university, or other institution of higher education and intend to share the results of your survey beyond internal review and use, you will likely need to contact your campus board for human subjects research, frequently called an Institutional Review Board (IRB). The IRB is a critical resource and the enforcing authority for ensuring that researchers follow ethical guidelines and protocols for human subjects research; it must review and approve your survey plan before you proceed. IRBs also provide detailed guidelines for responsible practice, such as the aforementioned practices for gathering minimal, if any, identifying information.
Association of Research Libraries, “LibValue — Digitized Special Collections,” accessed June 13, 2022.
Association of Research Libraries, “Pop-up survey question tree,” accessed June 13, 2022, via the Wayback Machine.
Association of Research Libraries, “MINES for Libraries,” accessed June 7, 2020.
Association of Research Libraries, “MINES for Libraries Final Report,” 2005 (see sample survey on page 12).
Plum, T., Franklin, B., Kyrillidou, M., Roebuck, G., & Davis, M. (2010). Measuring the Impact of Networked Electronic Resources. Performance Measurement and Metrics, 11(2).
Villaespesa, E. & Stack, J. (2015, January 30) Finding the motivation behind a click: Definition and implementation of a website audience segmentation. MW2015: Museums and the Web 2015.
Fielding, N. G., Lee, R. M., & Blank, G., eds. (2017). The SAGE Handbook of Online Research Methods. London: SAGE Publications.
Toepoel, V. (2016). Introduction to online surveys. In Doing Surveys Online, pp. 1-18. London: SAGE Publications Ltd.
Contributors to this page include Joyce Chapman and Harriett Green.