Focus Groups


A small, moderated group discussion used to elicit in-depth information from users who meet certain demographic or experiential criteria. To assess digital object use/reuse, a focus group would likely consist of people who have used or reused the organization’s digital objects. Focus groups may also be conducted with users who attempted digital object use/reuse but were stymied, or with users who have contemplated use/reuse but have not yet acted.

Applications for assessing digital content use/reuse

Focus groups allow practitioners to gain a deep understanding of user needs, experiences, beliefs, and attitudes, and to discuss topics in detail in a small group setting. They can also be used to place quantitative results in deeper context, gather feedback on tentative conclusions from other studies, or test future scenarios for services and systems. Focus groups typically include 5-10 people, are facilitated by a moderator, and may be witnessed by a recorder, notetaker, or other observer. The moderator’s role is to guide participants through a series of discussion topics during a 60-90 minute session, which can be held in person or virtually.

Like interviews, focus groups are a qualitative research method. Structured focus groups follow a script that is written ahead of time and contains the questions to be discussed, as well as probes for follow-up questions; unstructured focus groups may initially follow a script but can depart from it to explore new areas that arise. The moderator may be a trained facilitator who can help participants feel comfortable, ask unbiased questions, and probe further when follow-up questions are needed. Ideally, the moderator should not “represent” the organization or system that is the topic of discussion, so that participants feel free to express a range of reactions. If a cultural heritage institution does not have access to trained and unaffiliated individuals, staff or volunteers who conduct focus groups should familiarize themselves with best practices for focus group moderation (see supplemental materials).

Focus groups are often recorded. Recording may include video and audio, audio only, or notes taken by an observer. Participants’ consent must be obtained before recording. Reasons to record a focus group include: allowing the moderator to give full attention to the discussion rather than taking notes if a notetaker is not available, capturing verbal and non-verbal cues, transcribing discussion verbatim to avoid omissions, validating notes taken by an observer, enabling multiple listeners to check reliability, and decreasing the likelihood of interviewer bias.

Supplemental materials

Ethical guidelines

Practitioners should follow the practices laid out in the “Ethical considerations and guidelines for the assessment of use and reuse of digital content.” The Guidelines are meant both to inform practitioners in their decision-making, and to model for users what they can expect from those who steward digital collections.

Additional guidelines for responsible practice

If audio or video of the session is being recorded, consent should be procured from each participant via a signed consent form. The form should explain the purpose of the recording, how privacy will be protected, the extent to which the recording will be shared, and plans for retention and disposal of the recording.

When taking notes or transcribing a session, include only the minimal personally identifying information necessary, particularly if files will be shared with other staff or are accessible via shared computer drives. It is not necessary to include a participant’s name on a transcript. Instead, each participant can be assigned an identifier that is used in notes. If necessary, the identifiers can be associated with other information about the participants in a separate, secure document.
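The identifier approach above can be partially automated. The sketch below, with hypothetical names and sample text, replaces participant names in a transcript with stable identifiers (P1, P2, …) and returns the name-to-identifier key so it can be stored separately and securely. Real transcripts need more care (e.g., nicknames, names embedded in other words), so treat this as a starting point, not a complete de-identification tool.

```python
# Minimal sketch: pseudonymize a focus group transcript by replacing
# participant names with identifiers. Names and text are hypothetical.

def pseudonymize(transcript: str, names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each name with an identifier (P1, P2, ...).

    Returns the cleaned transcript and the name-to-identifier key;
    store the key in a separate, secure document.
    Note: naive string replacement -- review output for missed or
    partial matches before sharing.
    """
    key = {name: f"P{i + 1}" for i, name in enumerate(names)}
    for name, pid in key.items():
        transcript = transcript.replace(name, pid)
    return transcript, key

text = "Dana: I reuse images often. Lee: Dana makes a good point."
cleaned, key = pseudonymize(text, ["Dana", "Lee"])
print(cleaned)  # P1: I reuse images often. P2: P1 makes a good point.
print(key)      # {'Dana': 'P1', 'Lee': 'P2'}
```

Because the key is returned separately from the cleaned text, the transcript alone can be shared on a common drive while the key file stays access-restricted.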

Practitioners who are not trained in qualitative data collection methods should become acquainted with the concept of interviewer bias and how to avoid it before conducting focus groups. 

Depending on the type of institution at which a practitioner works, practitioners may need to contact the Institutional Review Board (IRB) to determine whether an IRB application is needed for research with human subjects. An IRB application may not be needed if the goal of the human subject research is for program improvements, and/or if findings are not intended to be generalizable or published.


Benefits

  • Unlike most quantitative data collection methods, focus groups and interviews offer the opportunity to seek clarification and ask follow-up questions to better understand users’ behavior around use/reuse of digital objects.

  • Focus groups and interviews can contribute rich anecdotal data and a human aspect to flesh out quantitative numbers. This may help an organization in storytelling around the impact of use/reuse of digital collections.

  • Conducting focus groups and interviews does not require software or specialized technology.

  • Focus groups and interviews often produce unexpected information, provided that the script/questions provide flexibility for a diversity of opinion to be voiced and heard.

  • Compared to individual interviews, focus groups allow participants to discuss and react to issues raised by other participants.


Drawbacks

  • Focus groups and interviews are not a useful way to gather comprehensive data on the number of instances of use/reuse of digital objects or on how the majority of such users reuse materials; instead, they gather detailed experiential data from a small number of individual users.

  • Practitioners may need additional training to become well-versed in best practices for focus group moderation.

  • Analysis of qualitative data such as that created by focus groups can be time intensive, particularly if practitioners plan to create a full transcript of a recorded session, or code the qualitative data.

  • It can be difficult to identify people to invite to participate in focus groups or individual interviews on the assessment of use/reuse of digital objects because digital object use is often anonymous.

  • Focus group discussion can sometimes be dominated by a few participants; others may be silenced. A moderator should be prepared with techniques to mitigate this dynamic.

  • Asking users to describe their own behavior can yield less accurate data than methods that actually track user behavior (web analytics, citation analysis). Users may not remember, or may not accurately report, their own behavior. This can be mitigated by using critical incident framing (“Tell me about a time when X. What did you do?”); however, focus groups inherently capture what participants say they did/felt/thought, which may not be what they actually did/felt/thought.

Learn how practitioners have used this method

Conducting focus groups for the Developing a Framework for Measuring Reuse of Digital Objects project
As part of the “Measuring Reuse” IMLS-funded grant project that laid the foundation for the development of this Assessment Toolkit, the project team spent a year gathering feedback from the digital library community via surveys and focus groups to identify the needs and ideal functionality of a digital object reuse assessment toolkit. In 2017 and 2018 the group conducted three rounds of focus groups on (1) the benefits, challenges, and barriers of assessing reuse; (2) the technology and standards currently used to conduct such assessment; and (3) privacy concerns and the cultural and ethical implications of gathering reuse data. Both in-person and virtual focus groups were conducted in each round. The white paper includes the focus group methodology and supporting documents in its appendices.

Kelly, E.J., Stein Kenfield, A., Muglia, C., O’Gara, G., Thompson, T., & Woolcott, L. (2018). Setting a Foundation for Assessing Content Reuse: A White Paper From the Developing a Framework for Measuring Reuse of Digital Objects project.

Additional resources

Archer, T. M. (2007). Using Guidelines To Support Quality Moderation of Focus Group Interviews. Mid-Western Educational Researcher, 20(1), 38–41.
Brophy, P. (2006). Measuring Library Performance: Principles and Techniques. Facet. 
Covey, D. T. (2011). Recruiting Content for the Institutional Repository: The Barriers Exceed the Benefits. Journal of Digital Information, 12(3), Article 3. 
Hughes, L. M., Ell, P. S., Knight, G. A. G., & Dobreva, M. (2015). Assessing and measuring impact of a digital collection in the humanities: An analysis of the SPHERE (Stormont Parliamentary Hansards: Embedded in Research and Education) Project. Digital Scholarship in the Humanities, 30(2), 183–198. 
Kenfield, A. S., Kelly, E. J., Muglia, C., O’Gara, G., Thompson, S., & Woolcott, L. (2019). Measuring Reuse of Institutionally-Hosted Grey Literature.
Marsh, D. E., Punzalan, R. L., Leopold, R., Butler, B., & Petrozzi, M. (2016). Stories of impact: The role of narrative in understanding the value and impact of digital collections. Archival Science, 16(4), 327–372. 
Meyer, E. T. (2011). Splashes and Ripples: Synthesizing the Evidence on the Impacts of Digital Resources (SSRN Scholarly Paper No. 1846535). 
Meyer, E., Eccles, K., Thelwall, M., & Madsen, C. (2009). Focus Groups | TIDSR: Toolkit for the Impact of Digitised Scholarly Resources. Final report to JISC on the usage and impact study of JISC-funded phase 1 digitisation projects and the toolkit for the impact of digitised scholarly resources (TIDSR). Oxford Internet Institute.
Punzalan, R. L., Marsh, D. E., & Cools, K. (2017). Beyond Clicks, Likes, and Downloads: Identifying Meaningful Impacts for Digitized Ethnographic Archives. Archivaria, 84(0), 61–102. 
Schwarz, B. B., & Asterhan, C. S. (2011). E-Moderation of Synchronous Discussions in Educational Settings: A Nascent Practice. Journal of the Learning Sciences, 20(3), 395–442. 
Tanner, S. (2012). Measuring the Impact of Digital Resources: The Balanced Value Impact Model. King’s College London.
U.S. Department of Health and Human Services. (2006). Focus Groups.


Contributors to this page include Joyce Chapman and Megan Oakleaf.

Cite this page

Chapman, J., & Oakleaf, M. (2023). Focus Groups. Digital Content Reuse Assessment Framework Toolkit (D-CRAFT); Council on Library & Information Resources.
