Document Review as a Qualitative Research Data Collection Method for Teacher Research

  • By: Pamela J. Bretschneider, Stefanie Cirilli, Tracey Jones, Shannon Lynch & Natalie Ann Wilson
  • Product: Sage Research Methods Cases Part 2
  • Publisher: SAGE Publications Ltd
  • Publication year: 2017
  • Online pub date: October 27, 2016
  • Discipline: Education, Teacher Resources
  • Methods: Documentary research, Qualitative data collection, Research questions
  • DOI: https://doi.org/10.4135/9781473957435
  • Keywords: autism spectrum disorders, classrooms, instructional strategies, National Institute of Child Health and Human Development, students, teaching
  • Type: Indirect case. Sub Discipline: Teacher Resources. Online ISBN: 9781473957435. Copyright: Contact SAGE Publications at http://www.sagepub.com

This case study is an introduction to the use of a document checklist as part of document review, a qualitative research data collection method. Details are provided on the types of documents that can be utilized in a document review; the advantages and disadvantages of using document review as a data collection method; a description of how to design, implement, analyze, and present documents used to answer a research question; and four examples of research-based document reviews in educational settings.

Learning Outcomes

By the end of this case, students should be able to

  • Develop understanding of the importance of teacher research in advancing knowledge to improve classroom instruction
  • Explain how document review can be used as a data collection method in answering a research question
  • Examine recent, student-conducted research using document review as a data collection method
  • Describe the various documents that can be used in a document review
  • Learn how to design, implement, analyze, and present data collected through a document checklist
  • Distinguish between document (documentary) review and a literature review (review of the literature)

Qualitative Research

Qualitative research spans many different disciplines and many different approaches to answering research questions. Qualitative research designs are gaining in perceived research value in many disciplines, including medicine, psychology, and education. Qualitative research is becoming more common in Kindergarten through Grade 12 (K-12) schools and is extremely useful for keeping up to date with best practices, developing strategies, and advancing professional practice. The process of qualitative research can generate results that enhance understanding of an issue, problem, practice, or phenomenon, on both a small and large scale. It can be used for curriculum development, teacher professional development, collaborative problem solving, administrative systems planning, strategy development, and many other purposes.

The purpose of this case study is to share the use of document review as a qualitative data collection strategy in teacher research and to increase understanding of the unique characteristics of document review as a method of data collection. Document review can also be used as a tool in mixed methods research designs, especially when a quantitative design alone cannot fully complete an investigation.

Teacher Research

The results of teacher research can ‘recalibrate their pedagogy and their understanding of their own work’ (Cazden, Diamonston, & NASO, as cited in Anderson, Herr, & Nihlen, 1994, p. 154). Teachers are the front line in classrooms today; thus, it is logical that input from them is crucial to understanding students, implementing best practices, and improving the everyday practice of teaching. Mohr et al. (2004) suggested the purpose of teachers conducting research is to seek knowledge and to be able to immediately use that information in the classrooms in which they teach.

Donna Phillips and Philip Carr (2014) described a snapshot of a student teacher who sought to investigate a claim from a new reading series publisher that one of the benefits of the program was increasing reading comprehension. She used document review to collect best practice research studies using a rubric she developed on which she recorded essential characteristics of teaching reading comprehension. Through her document analysis, she discovered strengths and weaknesses that needed to be addressed by curriculum changes. Thus, teachers can utilize results that can then be interwoven into their own classrooms. They can conduct research as a team or solve a problem alone and make a decision on what to implement in just one classroom. Research questions developed by teachers enable them to ‘relate particular issues to theories of teaching and learning via documentation and analysis; hence, teacher research links theory with practice’ (Bullough & Gitlin, as cited in Henderson, Meier, Perry, & Stremmel, 2012, p. 3). Essentially, school-wide research can improve teaching and learning. It can be designed to develop a community of practice, where teachers, administrators, and students learn from research and the power and expertise of each other.

Document Review as Compared to Literature Review

It is helpful when considering document review as a data collection method to compare it to a traditional literature review, a general component of all research projects. Essentially, a literature review summarizes the published literature, ‘a critical appraisal of other research on a given topic that helps to put that topic in context’ (Machi & McEvoy, 2012, p. 2). A scholarly literature review is logically organized, unbiased, comprehensive, and current.

Lankshear and Knobel (2004) located the primary difference between a literature review and document review (documentary research) in the scale and techniques involved:

[The difference between] doing a literature review as a vital component of an overall piece of research and conducting a full-scale piece of documentary research is that the latter usually involves a range of specialized techniques of analysis, interpretation, and data handling that are not typically used in literature reviews. (p. 58)

These processes are different, in that a literature review is used to investigate knowledge published from primary and secondary sources. Documentary research refers to many different types of documents (see Table 1 ) and is used as primary research data. As these researchers explained, ‘The difference is that a literature review is part of a larger project that is intended to inform. General forms of standalone, document-based research, by contrast, are studies in their own right’ (p. 6). Lankshear and Knobel suggested that document-based research requires not only general skills and techniques like those used in writing a literature review but also additional skills that are specialized for this process.

Document Review

Document review is a systematic collection, documentation, analysis and interpretation, and organization of data as a data collection method in research. There are many types of documents that can be used in a document review (see Table 1 ). The documents may be external or internal to an organization, such as a school, company, or government agency, and can be in hard copy or electronic form.

An example of the use of a document review outside of educational documents is a study conducted by the Government of Canada in 2011. The purpose of the investigation was to study hate-motivated crime in the country. The following data collection methods were used: stakeholder interviews, review of administrative data, survey of police services, and a document review. The purpose of using document review as a method was to ‘inform an assessment of the relevance of the Data Collection Strategy and to determine whether any feasible alternatives exist’ (Government of Canada, 2011, p. 3). Their list of documents used was extensive and was presented in a four-page Appendix in which they listed specific documents in the following categories: corporate, accountability, and political documents; program materials; hate crime reports; academic papers; and international reports (see Appendix E of this source).

Document review results in information and insight into the research question and into the practice of teaching. Utilizing document review as a method can result in evidence-based guidelines and best practices, as it provides a useful contribution to qualitative research designs in teacher research. It can be valuable in collecting research and developing insight related to special populations. All methods have their particular strengths; thus, the choice of any method, including document review, must be consistent with the research question and yield substantive data with which to answer it. Document review often serves as a starting point, used in conjunction with other methods to triangulate data (using different methods to collect data in the same study); triangulation reduces the risk of bias and increases understanding from different perspectives. For instance, in the research study noted above that investigated the strengths and weaknesses of a new reading series the school had adopted, the teacher (Courtney) used three methods of collecting data. Her first method was a review of best practices, from which she determined a set of characteristics essential for learning materials used to teach reading. This informed her second method, focus groups with teachers, and her third method, monitoring student progress using the new series. ‘Courtney’s work became a powerful aid when shared with others using the same or similar curriculum’ (Phillips & Carr, 2014, p. 21).

Often, documents that can be collected already exist, such as school records, curriculum units, public records, or teacher notes. If the documents provide first-hand information, they are primary sources, which means the document was written in the first person by someone who had direct experience with the document, the organization, or the phenomenon. If you seek to use documents from an organization, such as a school, make sure you seek approval from the administration of the organization that holds them. To assure you follow research ethics, if you use any document in its entirety, seek written permission to do so; if you quote from any document, make sure you cite properly.

The Internet and research-based databases provide a plethora of documents gathered from a vast number of studies, far more wide-reaching, including geographically, than any one individual study. Through a process of synthesis (combining all documents to identify and interpret their contents), this information can inform and support professional practice. Before embarking on a document review, develop a list of search terms through which you will locate documents. School library staff are of valuable assistance in guiding your parameters for searches and recommending particular databases; they are sleuths in locating sources and should be tapped for their researching skills. People tend to search the Internet by topic, but much can be learned by searching by particular authors/researchers known in the subject area. The sources cited in other sources are also a rich source of documents and other studies. Types of documents that can be collected, depending upon the research needs, are listed in Table 1 . The advantages (see Table 2 ) and disadvantages (see Table 3 ) of using document review as a data collection method are provided to assist researchers in weighing the benefits and drawbacks of this method of data collection.

Criteria for Developing a Document Review Checklist

One does not just search for documents without establishing guidelines for data collection. Caution should be exercised in assessing Internet-based documents to include in a document checklist (see samples below). Develop criteria for judging where to seek documents, whether you have access and permission to use documents, how to judge the relevance and value of documents, and when to stop looking. As with a literature review, if using a document checklist as a method of gathering documents (e.g. curriculum guides and lists of best practices), it is advisable to limit the search to documents located in peer-reviewed journals.

Choosing Documents for Inclusion

Do not choose documents based on the ease of obtaining them; instead, choose documents that, collectively, will assist the researcher in answering the research question. When choosing documents for inclusion, adhere to your set guidelines for selection determination:

  • do you have permission to use the document? (written permission is advisable for a paper trail)
  • what is ‘out there’ and are the documents readily available for review?
  • is the document relevant in answering the research question?
  • where did the document come from; what is the history of the document?
  • what is the source of the document?
  • for what audience was the document intended?
  • is it a reliable and genuine source?
  • is the information contained in the document consistent?
  • how current is the document? What is the date? Is it recent enough to be valuable?
  • who are the authors and what are their credentials and experience?
  • is the document a result of a structured and systematic process?
  • will the document add to understanding your topic or question? Is the document appropriate to the issue?
  • does the document contain unique characteristics not yet noted in other documents you select?
  • is the document clear, complete, and readable?
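Selection guidelines like these can be turned into a simple screening routine. The sketch below is illustrative only: every field name and the 10-year threshold are assumptions made for the example, not criteria prescribed by the case study.

```python
# Illustrative inclusion screen for a candidate document record.
# All field names and the default age threshold are assumptions
# for this sketch, not a prescribed standard.
def screen_document(doc, current_year, max_age_years=10):
    """Return (include, reasons) for a candidate document."""
    reasons = []
    if not doc.get("permission"):
        reasons.append("no permission to use the document")
    if not doc.get("relevant"):
        reasons.append("not relevant to the research question")
    if current_year - doc.get("year", 0) > max_age_years:
        reasons.append(f"older than {max_age_years} years")
    if not doc.get("reliable_source"):
        reasons.append("source not judged reliable and genuine")
    return len(reasons) == 0, reasons
```

Recording the reasons for exclusion, not just the decision, leaves the paper trail the guidelines above call for.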

If you use Internet sources to obtain documents, it is advisable to:

  • vary the sources you use; do not limit yourself to a small group of journals;
  • be selective; scrutinize all documents under consideration for inclusion, using a pre-determined set of criteria appropriate to the research question;
  • carefully note all content and the URL of the document you located (to include in a references list and for author and reader access at a later date). Save these links in an electronic folder on your computer;
  • set a time limit on when data collection will stop (or you will keep going indefinitely!) or set a certain number of documents. Too many will overwhelm you and the reader and detract from those documents you have determined are of great value to answering the research question;
  • develop a preliminary form through which to record documents, but know this will be an evolution, as the data will drive your form and edits will need to be made. Begin with a sample, such as the following, and adjust for your data needs. There are many blank templates for document review available on the Internet (also, see Appendix C, Document Analysis Template, in Mayan, 2009, pp. 147-149).
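A preliminary recording form can be kept as a plain spreadsheet-style file. Here is a minimal sketch in Python; the column names and the sample entry are assumptions to adapt to your own data needs, not a fixed template.

```python
import csv

# Illustrative column set for a preliminary document review form;
# adapt the fields as your data collection evolves.
COLUMNS = ["title", "author", "year", "source", "peer_reviewed",
           "key_findings", "url"]

def save_form(rows, path):
    """Write collected document records to a CSV recording form."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

# Invented sample record, for illustration only.
rows = [
    {"title": "Strategies for inclusive classrooms", "author": "Doe, J.",
     "year": 2014, "source": "Journal X", "peer_reviewed": "yes",
     "key_findings": "visual schedules; structured seating",
     "url": "https://example.org/doe2014"},
]
save_form(rows, "document_review_form.csv")
```

A CSV file opens directly in any spreadsheet program, so the form stays easy to revise as the data drive changes to it.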

Once You’ve Collected the Documents

Now what do you do with all you have collected? How do you make sense of it all to present in a narrative? Sometimes the data you collect are vast; the dilemma arises on how to choose the ones to utilize, how to organize them, and how to present them. This process requires time, effort, re-organizing, and editing before the final presentation in narrative form:

  • print out those documents that you have determined to use; yes, it seems like an old method, but using color coding in hard copy can prove less cumbersome than completing that task on the computer, with various screens; it provides ease in data analysis of the documents;
  • collect like documents together, compare, analyze; then compare to other documents collectively;
  • organize the data in such a way that makes sense and is useful to your process; then identify themes or trends that emerge as you make sense of all you have collected (data analysis);
  • note any specific language used across the documents (such as technical and informal), note inconsistencies, note authorship, and note dates;
  • use colored markers for major themes that emerge from various sources; this again will aid in ‘writing up’ results of your collection and presenting them in a logical order;
  • stay organized!
  • complete your checklist, using a system such as alphabetizing by author or by theme (see samples presented later in this case study);
  • importantly, maintain confidentiality of all data you collect.
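The theme-identification steps above amount to tallying codes across documents. A minimal sketch, in which the document names and themes are invented for illustration:

```python
from collections import Counter

# Each collected document is tagged with the themes coded for it
# (document names and themes here are invented examples).
coded_documents = {
    "doc_A": ["visual supports", "routines"],
    "doc_B": ["routines", "peer support"],
    "doc_C": ["visual supports", "routines", "sensory breaks"],
}

# Tally how many documents mention each theme.
theme_counts = Counter(t for themes in coded_documents.values() for t in themes)

# Treat themes appearing in two or more documents as major themes,
# ordered by frequency (the threshold of 2 is an assumption).
major_themes = [t for t, n in theme_counts.most_common() if n >= 2]
```

The ordered tally plays the same role as the colored markers: it surfaces which themes recur across sources before you ‘write up’ the results.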

Presenting the Documents

You will ‘write up’ your findings from your document review in narrative form. Your completed document checklist will be your guide:

  • present the exact checklist you developed, completed with your document information inserted in the relevant columns,
  • present to the reader a narrative of what you sought and the significance of what you found,
  • refer to your completed checklist as an appendix,
  • essentially your narrative should accurately and appropriately present your findings.

Samples from Actual Studies

The following four samples will provide a deeper understanding of why the document review was chosen, how the document review was designed and conducted, partial components of the actual review checklist, and summaries of the significance of the data collected. Each of these samples was part of dissertation research:

Sample 1. What are Teacher Practices and Resources for Preparing Teachers to Assist Students with Autism Spectrum Disorder in Inclusive, Primary Classrooms? (Stefanie Cirilli, 2014)

The researcher conducted a qualitative, phenomenological research study that employed a document review, semi-structured interviews, and a questionnaire. From the document review sources, the researcher compiled strategies for organizing and structuring an inclusive, primary classroom for students with autism spectrum disorder (ASD).

The researcher conducted a document review to identify what specific strategies existed for organizing and structuring an inclusive classroom to accommodate students with ASD. The document review was a general, exhaustive review and included books, articles, prior studies, and professional websites. A variety of sources were examined to determine methods utilized specifically for classroom organization, such as peer-reviewed articles, professional websites, and books on the inclusion of students with ASD. The researcher compiled specific strategies for preparing and organizing an inclusive classroom during the document review so that general education teachers may have a variety of strategies to choose from, depending on their students’ needs. The resulting list consists of specific strategies that may be helpful for preparing an inclusive classroom to accommodate students with ASD.

The document review allowed for a reasonably large and efficient data collection, since documents were located in hard copies and Internet versions. The researcher searched a vast number of sources in order to compile a detailed list of recommended strategies for organizing inclusive classrooms for students with ASD. The resulting list may be utilized as an additional resource for general education teachers when preparing their classrooms to teach students with ASD.

Document Review Form

Based on the supporting research in the literature review, the researcher developed the criteria for judging each document and recorded specific information from the review onto the form.

The document review instrument was arranged in a table format. Each row included the title of the document, whether or not the document was peer reviewed, strategies found in the document for organizing an inclusive classroom, what resources or materials are necessary to implement the strategies, and a description of the strategies. A full reference list was also included after the table so that readers could easily retrieve each document. The revision of this form at the beginning of data collection allowed for more efficient recording of information. The resulting document review form provided a succinct and organized visual for recording the data. The researcher only considered documents from the past 10 years for review so that the strategies listed would be current.

Document Review Process

The first step of the document review process was to search for documents online as well as in hard copy resources such as books and journals. The researcher searched for these documents using Google Scholar and the Jones eGlobal Library. Specifically, EBSCO databases and Google Scholar were searched using Boolean combinations of these key words: autism, autism spectrum disorder, classroom organization, classroom structure, inclusion, primary classroom, and elementary classroom. For instance, combinations such as ‘autism AND classroom organization’ were utilized. Additionally, the researcher searched some professional websites that are associated with autism research: Autism Speaks, Autism Society of America, Autism Research Institute, and the Council for Exceptional Children. Amazon and the local public library were utilized to locate hard copies of books about ASD and inclusion.
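Boolean combinations such as ‘autism AND classroom organization’ can be generated systematically from the keyword groups. A small sketch, grouping the study's listed key words into two illustrative sets (the grouping itself is an assumption):

```python
from itertools import product

# Keyword groups drawn from the search terms listed in the study;
# splitting them into "condition" and "setting" groups is an
# assumption made for this sketch.
condition_terms = ["autism", "autism spectrum disorder"]
setting_terms = ["classroom organization", "classroom structure",
                 "inclusion", "primary classroom", "elementary classroom"]

# One Boolean query per pair of terms.
queries = [f"{a} AND {b}" for a, b in product(condition_terms, setting_terms)]
```

Generating the full list up front helps ensure no keyword pairing is skipped during the database searches.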

The researcher located each document and decided whether the document was useful to the study, that is, whether it contained strategies specifically for organizing and structuring an inclusive environment for students with ASD in an elementary classroom. Only documents published within the past 10 years were considered. If a document met these initial criteria, the researcher examined it further and marked strategies with Post-it notes. Documents were read thoroughly, and the information was recorded onto the document review form. Data reached saturation when redundancy occurred in citations. Content analysis followed.


A summary of results was then developed and placed in a table that resulted in a list of strategies for organizing and structuring an inclusive, primary classroom for students with ASD. Document Review References were then provided. The following is an abbreviated list of results.

Note: ASD = autism spectrum disorder.

The researcher successfully planned and conducted a qualitative research study regarding ASD and inclusion, gathered a considerable amount of information to help improve teacher practices and resources for the inclusion of students with ASD, and identified strategies for inclusion and recommended resources to accommodate students with ASD in general education classrooms. With the rise in ASD and the implications of the Individuals with Disabilities Education Improvement Act (IDEA, 2004), this information may be helpful to educators and policymakers as they provide students with ASD a free and appropriate public education in the least restrictive environment.

Sample 2. How Can Schools Effectively Meet the Needs of Students with Reading Comprehension Difficulties in Grade 3? (Tracey Jones, 2013)

This qualitative research design included administering student questionnaires, conducting interviews with one resource teacher and one principal, and conducting a document review. Prior to undertaking this effort, the researcher defined criteria for inclusion in this document review. The search was limited to documents published within the last 5 years.

This document review explored the research surrounding effective strategies and programs used for students with reading comprehension difficulties. This included research-based programs, such as Corrective Reading, Reading Mastery, guided reading, and various multiple-strategy approaches. A list of these resources resulting from the review was developed and shared with various groups, including teachers, administrators, parents, and specialists, for their use as they work with students on reading comprehension skills.

Sample 3. What Instructional Writing Strategies Are Available For Teacher Use To Increase Writing Success For Students In Grades 1-4? (Shannon Lynch, 2015)

Research methods and instrumentation.

Qualitative data will be collected and analyzed using interviews, questionnaires with open-ended questions, and a compilation of instructional strategies available to teachers, using document review to investigate recent research studies that resulted in strategies for teaching writing. The strategies will be compiled into tables, organized by grade level.

Document Review Checklists

The resulting list of strategies will be presented at the conclusion of the study and shared with current teachers, with the aim of adding to their list of teaching tools to assist students in advancing their writing skills; it will provide teachers with current research-based strategies with which they may not have been previously familiar.

Sample 4. What Are Michigan Teachers’ Reactions to Value-Added Teacher Evaluation Systems? (Natalie Ann Wilson, 2014)

Questionnaires, interviews, and a legislative document analysis were used as data collection strategies to illustrate the experiences of teachers who work at economically disadvantaged schools and who must demonstrate their effectiveness under the new legislation in the State of Michigan, United States. A legislative checklist was utilized to analyze the amendments to the Revised School Code and Teacher’s Tenure Act. Information in the checklist included (a) how and why the laws were proposed, (b) what the mandates require, and (c) whom they affect. In conducting the legislative analysis, the most significant changes in the evaluation process were highlighted. Describing the evaluation documents and the related laws helped contextualize the changes that have taken place and the subsequent changes in teachers’ practices. Analysis of the legislative documents illustrated the nature of the changes that took place and provided context for the decisions and reactions that teachers are experiencing because of the changes.

The checklists include an analysis of the legislative language in the Revised School Code and the Teacher’s Tenure Act, both U.S. state government-regulated documents. The analysis was conducted to illustrate the historical background of the laws and their mandates, as well as to provide a context for the changes that are taking place at the study site. A checklist was developed and utilized to analyze the language of the amendments to the Revised School Code (2011) and Teacher’s Tenure Act (2011) that pertained most closely to the teacher evaluation system and tenure process. The information consisted of the history of the legislation, its mandates, funding, and processes for reporting. Using the information in the document checklist helped provide a background and context for the new mandates each district has to follow. The document checklist was used as both a data collection and analysis tool because the descriptions contained information that helped further understanding of the research question ( Figure 1 ).

Figure 1. Legislative checklist for the Revised School Code.


Note: K-12 = Kindergarten through Grade 12.

Exercises and Discussion Questions

  • 1. What questions do you have in your classrooms that you’d like to investigate?
  • 2. How might you answer the question through the collection of documents, and what type of documents might be available to you to collect, review, and analyze to assist you in answering your research question?
  • 3. What criteria would you use in seeking the documents needed for your review? Would you need and from whom would you seek permission to use the documents?
  • 4. Develop a basic checklist through which to record the documents you might collect.


Document Analysis - How to Analyze Text Data for Research


Introduction


In qualitative research, you can collect primary data through surveys, observations, or interviews, to name a few examples. In addition, you can rely on document analysis when the data already exists in secondary sources like books, public reports, or other archival records that are relevant to your research inquiry.

In this article, we will look at the role of document analysis, the relationship between document analysis and text analysis, and how text analysis software like ATLAS.ti can help you conduct qualitative research.

What is document analysis?

Document analysis is a systematic procedure used in qualitative research to review and interpret the information embedded in written materials. These materials, often referred to as “documents,” can encompass a wide range of physical and digital sources, such as newspapers, diaries, letters, policy documents, contracts, reports, transcripts, and many others.

At its core, document analysis involves critically examining these sources to gather insightful data and understand the context in which they were created. Researchers can perform sentiment analysis, text mining, and text categorization, to name a few methods. The goal is not just to derive facts from the documents, but also to understand the underlying nuances, motivations, and perspectives that they represent. For instance, a historical researcher may examine old letters not just to get a chronological account of events, but also to understand the emotions, beliefs, and values of people during that era.

Benefits of document analysis

There are several advantages to using document analysis in research:

  • Authenticity: Since documents are typically created for purposes other than research, they can offer an unobtrusive and genuine insight into the topic at hand, without the potential biases introduced by direct observation or interviews.
  • Availability: Documents, especially those in the public domain, are widely accessible, making it easier for researchers to source information.
  • Cost-effectiveness: As these documents already exist, researchers can save time and resources compared to other data collection methods.

However, document analysis is not without challenges. One must ensure the documents are authentic and reliable. Furthermore, the researcher must be adept at discerning between objective facts and subjective interpretations present in the document.

Document analysis is a versatile method in qualitative research that offers a lens into the intricate layers of meaning, context, and perspective found within textual materials. Through careful and systematic examination, it unveils the richness and depth of the information housed in documents, providing a unique dimension to research findings.

Document analysis is employed in a myriad of sectors, serving various purposes to generate actionable insights. Whether it's understanding customer sentiments or gleaning insights from historical records, this method offers valuable information. Here are some examples of how document analysis is applied.

Analyzing surveys and their responses

A common use of document analysis in the business world revolves around customer surveys. These surveys are designed to collect data on the customer experience, seeking to understand how products or services meet or fall short of customer expectations.

By analyzing customer survey responses, companies can identify areas of improvement, gauge satisfaction levels, and make informed decisions to enhance the customer experience. Even if customer service teams designed a survey for a specific purpose, text analytics of the responses can focus on different angles to gather insights for new research questions.

Examining customer feedback through social media posts

In today's digital age, social media is a goldmine of customer feedback. Customers frequently share their experiences, both positive and negative, on platforms like Twitter, Facebook, and Instagram.

Through document analysis of social media posts, companies can get a real-time pulse of their customer sentiments. This not only helps in immediate issue resolution but also in shaping product or service strategies to align with customer preferences.

Interpreting customer support tickets

Another rich source of data is customer support tickets. These tickets often contain detailed descriptions of issues faced by customers, their frustrations, or sometimes their appreciation for assistance received.

By employing document analysis on these tickets, businesses can detect patterns, identify recurring issues, and work towards streamlining their support processes. This ensures a smoother and more satisfying customer experience.

Historical research and social studies

Beyond the world of business, document analysis plays a pivotal role in historical and social research. Scholars analyze old manuscripts, letters, and other archival materials to construct a narrative of past events, cultures, and civilizations.

As a result, document analysis is an ideal method for historical research since generating new data is less feasible than turning to existing sources for analysis. Researchers can not only examine historical narratives but also how those narratives were constructed in their own time.

Performing document analysis is a structured process that ensures researchers can derive meaningful, qualitative insights by organizing source material into structured data. Here's a brief outline of the process:

  • Define the research question
  • Choose relevant documents
  • Prepare and organize the documents
  • Begin initial review and coding
  • Analyze and interpret the data
  • Present findings and draw conclusions

The process in detail

Before diving into the documents, it's crucial to have a clear research question or objective. This serves as the foundation for the entire analysis and guides the selection and review of documents. A well-defined question will focus the research, ensuring that the document analysis is targeted and relevant.

The next step is to identify and select documents that align with the research question. It's vital to ensure that these documents are credible, reliable, and pertinent to the research inquiry. The chosen materials can vary from official reports, personal diaries, to digital resources like social media data, depending on the nature of the research.

Once the documents are selected, they need to be organized in a manner that facilitates smooth analysis. This could mean categorizing documents by themes, chronology, or source types. Digital tools and data analysis software, such as ATLAS.ti, can assist in this phase, making the organization more efficient and helping researchers locate specific data when needed.

With everything in place, the researcher starts an initial review of the documents. During this phase, the emphasis is on identifying patterns, themes, or specific information relevant to the research question.

Coding involves assigning labels or tags to sections of the text to categorize the information. This step is iterative, and codes can be refined as the researcher delves deeper.

After coding, interesting patterns across codes can be analyzed. Here, researchers seek to draw meaningful connections between codes, identify overarching themes, and interpret the data in the context of the research question.

This is where the hidden insights and deeper understanding emerge, as researchers juxtapose various pieces of information and infer meaning from them.

Finally, after the intensive process of document analysis, the researcher consolidates their findings, crafting a narrative or report that presents the results. This might also involve visual representations like charts or graphs, especially when demonstrating patterns or trends.

Drawing conclusions involves synthesizing the insights gained from the analysis and offering answers or perspectives in relation to the original research question.

Ultimately, document analysis is a meticulous and iterative procedure. But with a clear plan and systematic approach, it becomes a potent tool in the researcher's arsenal, allowing them to uncover profound insights from textual data.

Text analysis, often referenced alongside document analysis, is a method that focuses on extracting meaningful information from textual data. While document analysis revolves around reviewing and interpreting data from various sources, text analysis hones in on the intricate details within these documents, enabling a deeper understanding. Both these methods are vital in fields such as linguistics, literature, social sciences, and business analytics.

In the context of document analysis, text analysis emerges as a nuanced exploration of the textual content. After documents have been sourced, be it from books, articles, social networks, or any other medium, they undergo a preprocessing phase. Here, irrelevant information is eliminated, errors are rectified, and the text may be translated or converted to ensure uniformity.

This cleaned text is then tokenized into smaller units like words or phrases, facilitating a granular review. Techniques specific to text analysis, such as topic modeling to determine discussed subjects or pattern recognition to identify trends, are applied.
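The preprocessing and tokenization step can be sketched with Python's standard library; the cleaning rules below are illustrative choices, not a standard:

```python
import re

def preprocess(text: str) -> str:
    """Lowercase and strip characters that are noise for word-level analysis."""
    text = text.lower()
    return re.sub(r"[^a-z\s']", " ", text)

def tokenize(text: str) -> list[str]:
    """Split cleaned text into word tokens."""
    return re.findall(r"[a-z']+", preprocess(text))

tokens = tokenize("Document analysis, done well, repays careful reading!")
print(tokens)
# ['document', 'analysis', 'done', 'well', 'repays', 'careful', 'reading']
```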

The derived insights can be visualized using tools like graphs or charts, offering a clearer understanding of the content's depth. Interpretation follows, allowing researchers to draw actionable insights or theoretical conclusions based on both the broader document context and the specific text analysis.

Merging text analysis with document analysis presents unique challenges. With the proliferation of digital content, managing vast data sets becomes a significant hurdle. The inherent variability of language, laden with cultural nuances, idioms, and sometimes sarcasm, can make precise interpretation elusive.

Many text analysis tools exist that can facilitate the analytical process. ATLAS.ti offers a well-rounded solution as text analytics software. In this section, we highlight some of the tools that can help you conduct document analysis.

Word Frequencies

A word cloud can be a powerful text analytics tool to understand the nature of human language as it pertains to a particular context. Researchers can perform text mining on their unstructured text data to get a sense of what is being discussed. The Word Frequencies tool can also parse out specific parts of speech, facilitating more granular text extraction.
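The underlying idea of word frequencies can be sketched in a few lines of Python with `collections.Counter`; the stop-word list here is a small illustrative stand-in, not the one any particular tool uses:

```python
from collections import Counter
import re

# An illustrative (deliberately tiny) stop-word list.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "are"}

def word_frequencies(text: str, top_n: int = 5) -> list[tuple[str, int]]:
    """Count content words, ignoring the stop-word list above."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

sample = ("Customers praised the support team. The support team resolved "
          "issues quickly, and customers noticed.")
print(word_frequencies(sample, top_n=3))
# [('customers', 2), ('support', 2), ('team', 2)]
```

These counts are exactly the data a word cloud visualizes: word size proportional to frequency.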

Sentiment Analysis

The Sentiment Analysis tool employs natural language processing (NLP) and machine learning to analyze text by sentiment. This is important for tasks such as analyzing customer reviews and assessing customer satisfaction, because you can quickly categorize large numbers of customer records by their positive or negative sentiment.
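By way of illustration only, a toy lexicon-based scorer shows the basic idea of mapping text to a sentiment label; real tools, including ATLAS.ti's, rely on NLP and machine learning rather than a hand-made word list like this one:

```python
# A toy lexicon-based sentiment scorer, for illustration only.
# The word lists are invented; real sentiment analysis is model-based.
POSITIVE = {"great", "helpful", "fast", "love", "excellent"}
NEGATIVE = {"slow", "broken", "frustrating", "bad", "refund"}

def sentiment(text: str) -> str:
    """Label text positive/negative/neutral by counting lexicon hits."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tickets = [
    "Support was fast and helpful",
    "The update left my app broken and slow",
    "Order arrived on Tuesday",
]
print([sentiment(t) for t in tickets])  # ['positive', 'negative', 'neutral']
```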

AI Coding

AI Coding relies on large amounts of training data to interpret text and automatically code large volumes of qualitative data. Rather than reading every document line by line, you can turn to AI Coding to process your data and devote time to the more essential tasks of analysis, such as critical reflection and interpretation.

These text analytics tools can be a powerful complement to research. When you're conducting document analysis to understand the meaning of text, AI Coding can help with providing a code structure or organization of data that helps to identify deeper insights.

AI Summaries

Dealing with large numbers of discrete documents can be a daunting task if done manually, especially if each document in your data set is lengthy and complicated. Simplifying the meaning of documents down to their essential insights can help researchers identify patterns in the data.

AI Summaries fills this role by using natural language processing algorithms to simplify data to its salient points. Text generated by AI Summaries is stored in memos attached to documents to illustrate pathways to coding and analysis or to highlight how the data conveys meaning.
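The principle behind extractive summarization can be sketched with the standard library: score each sentence by the frequency of its words and keep the top scorer. This is a naive illustration of the idea, not how AI Summaries is implemented:

```python
# Naive extractive summarization: sentences containing the document's
# most frequent words are assumed to carry its salient points.
import re
from collections import Counter

def summarize(text: str, n_sentences: int = 1) -> str:
    """Return the n highest-scoring sentences as a crude summary."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)

    def score(sentence: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    ranked = sorted(sentences, key=score, reverse=True)
    return " ".join(ranked[:n_sentences])

text = ("Fees were discussed. Fees and distance were discussed at every "
        "meeting. The agenda was short.")
print(summarize(text))
# Fees and distance were discussed at every meeting.
```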


Document analysis in health policy research: the READ approach

Sarah L Dalglish, Hina Khalid, Shannon A McMahon, Document analysis in health policy research: the READ approach, Health Policy and Planning, Volume 35, Issue 10, December 2020, Pages 1424–1431, https://doi.org/10.1093/heapol/czaa064

Document analysis is one of the most commonly used and powerful methods in health policy research. While existing qualitative research manuals offer direction for conducting document analysis, there has been little specific discussion about how to use this method to understand and analyse health policy. Drawing on guidance from other disciplines and our own research experience, we present a systematic approach for document analysis in health policy research called the READ approach: (1) ready your materials, (2) extract data, (3) analyse data and (4) distil your findings. We provide practical advice on each step, with consideration of epistemological and theoretical issues such as the socially constructed nature of documents and their role in modern bureaucracies. We provide examples of document analysis from two case studies from our work in Pakistan and Niger in which documents provided critical insight and advanced empirical and theoretical understanding of a health policy issue. Coding tools for each case study are included as Supplementary Files to inspire and guide future research. These case studies illustrate the value of rigorous document analysis to understand policy content and processes and discourse around policy, in ways that are either not possible using other methods, or greatly enrich other methods such as in-depth interviews and observation. Given the central nature of documents to health policy research and importance of reading them critically, the READ approach provides practical guidance on gaining the most out of documents and ensuring rigour in document analysis.

Rigour in qualitative research is judged partly by the use of deliberate, systematic procedures; however, little specific guidance is available for analysing documents, a nonetheless common method in health policy research.

Document analysis is useful for understanding policy content across time and geographies, documenting processes, triangulating with interviews and other sources of data, understanding how information and ideas are presented formally, and understanding issue framing, among other purposes.

The READ (Ready materials, Extract data, Analyse data, Distil) approach provides a step-by-step guide to conducting document analysis for qualitative policy research.

The READ approach can be adapted to different purposes and types of research, two examples of which are presented in this article, with sample tools in the Supplementary Materials.

Document analysis (also called document review) is one of the most commonly used methods in health policy research; it is nearly impossible to conduct policy research without it. Writing in the early 20th century, Weber (2015) identified the importance of formal, written documents as a key characteristic of the bureaucracies by which modern societies function, including in public health. Accordingly, critical social research has a long tradition of documentary review: Marx analysed official reports, laws, statutes, census reports and newspapers and periodicals over a nearly 50-year period to come to his world-altering conclusions ( Harvey, 1990 ). Yet in much of social science research, ‘documents are placed at the margins of consideration,’ with privilege given to the spoken word via methods such as interviews, possibly due to the fact that many qualitative methods were developed in the anthropological tradition to study mainly pre-literate societies ( Prior, 2003 ). To date, little specific guidance is available to help health policy researchers make the most of these wells of information.

The term ‘documents’ is defined here broadly, following Prior, as physical or virtual artefacts designed by creators, for users, to function within a particular setting ( Prior, 2003 ). Documents exist not as standalone objects of study but must be understood in the social web of meaning within which they are produced and consumed. For example, some analysts distinguish between public documents (produced in the context of public sector activities), private documents (from business and civil society) and personal documents (created by or for individuals, and generally not meant for public consumption) ( Mogalakwe, 2009 ). Documents can be used in a number of ways throughout the research process ( Bowen, 2009 ). In the planning or study design phase, they can be used to gather background information and help refine the research question. Documents can also be used to spark ideas for disseminating research once it is complete, by observing the ways those who will use the research speak to and communicate ideas with one another.

Documents can also be used during data collection and analysis to help answer research questions. Recent health policy research shows that this can be done in at least four ways. Frequently, policy documents are reviewed to describe the content or categorize the approaches to specific health problems in existing policies, as in reviews of the composition of drowning prevention resources in the United States or policy responses to foetal alcohol spectrum disorder in South Africa ( Katchmarchi et al. , 2018 ; Adebiyi et al. , 2019 ). In other cases, non-policy documents are used to examine the implementation of health policies in real-world settings, as in a review of web sources and newspapers analysing the functioning of community health councils in New Zealand ( Gurung et al. , 2020 ). Perhaps less frequently, document analysis is used to analyse policy processes, as in an assessment of multi-sectoral planning process for nutrition in Burkina Faso ( Ouedraogo et al. , 2020 ). Finally, and most broadly, document analysis can be used to inform new policies, as in one study that assessed cigarette sticks as communication and branding ‘documents,’ to suggest avenues for further regulation and tobacco control activities ( Smith et al. , 2017 ).

This practice paper provides an overarching method for conducting document analysis, which can be adapted to a multitude of research questions and topics. Document analysis is used in most or all policy studies; the aim of this article is to provide a systematized method that will enhance procedural rigour. We provide an overview of document analysis, drawing on guidance from disciplines adjacent to public health, introduce the ‘READ’ approach to document analysis and provide two short case studies demonstrating how document analysis can be applied.

Document analysis is a systematic procedure for reviewing or evaluating documents, which can be used to provide context, generate questions, supplement other types of research data, track change over time and corroborate other sources ( Bowen, 2009 ). In one commonly cited approach in social research, Bowen recommends first skimming the documents to get an overview, then reading to identify relevant categories of analysis for the overall set of documents and finally interpreting the body of documents ( Bowen, 2009 ). Document analysis can include both quantitative and qualitative components: the approach presented here can be used with either set of methods, but we emphasize qualitative ones, which are more adapted to the socially constructed meaning-making inherent to collaborative exercises such as policymaking.

The study of documents as a research method is common to a number of social science disciplines—yet in many of these fields, including sociology ( Mogalakwe, 2009 ), anthropology ( Prior, 2003 ) and political science ( Wesley, 2010 ), document-based research is described as ill-considered and underutilized. Unsurprisingly, textual analysis is perhaps most developed in fields such as media studies, cultural studies and literary theory, all disciplines that recognize documents as ‘social facts’ that are created, consumed, shared and utilized in socially organized ways ( Atkinson and Coffey, 1997 ). Documents exist within social ‘fields of action,’ a term used to designate the environments within which individuals and groups interact. Documents are therefore not mere records of social life, but integral parts of it—and indeed can become agents in their own right ( Prior, 2003 ). Powerful entities also manipulate the nature and content of knowledge; therefore, gaps in available information must be understood as reflecting and potentially reinforcing societal power relations ( Bryman and Burgess, 1994 ).

Document analysis, like any research method, can be subject to concerns regarding validity, reliability, authenticity, motivated authorship, lack of representativity and so on. However, these can be mitigated or avoided using standard techniques to enhance qualitative rigour, such as triangulation (within documents and across methods and theoretical perspectives), ensuring adequate sample size or ‘engagement’ with the documents, member checking, peer debriefing and so on ( Maxwell, 2005 ).

Document analysis can be used as a standalone method, e.g. to analyse the contents of specific types of policy as they evolve over time and differ across geographies, but document analysis can also be powerfully combined with other types of methods to cross-validate (i.e. triangulate) and deepen the value of concurrent methods. As one guide to public policy research puts it, ‘almost all likely sources of information, data, and ideas fall into two general types: documents and people’ ( Bardach and Patashnik, 2015 ). Thus, researchers can ask interviewees to address questions that arise from policy documents and point the way to useful new documents. Bardach and Patashnik suggest alternating between documents and interviews as sources of information, as one tends to lead to the other, such as by scanning interviewees’ bookshelves and papers for titles and author names ( Bardach and Patashnik, 2015 ). Depending on your research questions, document analysis can be used in combination with different types of interviews ( Berner-Rodoreda et al. , 2018 ), observation ( Harvey, 2018 ), and quantitative analyses, among other common methods in policy research.

The READ approach to document analysis is a systematic procedure for collecting documents and gaining information from them in the context of health policy studies at any level (global, national, local, etc.). The steps consist of: (1) ready your materials, (2) extract data, (3) analyse data and (4) distil your findings. We describe each of these steps in turn.

Step 1. Ready your materials

At the outset, researchers must set parameters in terms of the nature and number (approximately) of documents they plan to analyse, based on the research question. How much time will you allocate to the document analysis, and what is the scope of your research question? Depending on the answers to these questions, criteria should be established around (1) the topic (a particular policy, programme, or health issue, narrowly defined according to the research question); (2) dates of inclusion (whether taking the long view of several decades, or zooming in on a specific event or period in time); and (3) an indicative list of places to search for documents (possibilities include databases such as Ministry archives; LexisNexis or other databases; online searches; and particularly interview subjects). For difficult-to-obtain working documents or otherwise non-public items, bringing a flash drive to interviews is one of the best ways to gain access to valuable documents.

For research focusing on a single policy or programme, you may review only a handful of documents. However, if you are looking at multiple policies, health issues, or contexts, or reviewing shorter documents (such as newspaper articles), you may look at hundreds, or even thousands of documents. When considering the number of documents you will analyse, you should make notes on the type of information you plan to extract from documents—i.e. what it is you hope to learn, and how this will help answer your research question(s). The initial criteria—and the data you seek to extract from documents—will likely evolve over the course of the research, as it becomes clear whether they will yield too few documents and information (a rare outcome), far too many documents and too much information (a much more common outcome) or documents that fail to address the research question; however, it is important to have a starting point to guide the search. If you find that the documents you need are unavailable, you may need to reassess your research questions or consider other methods of inquiry. If you have too many documents, you can either analyse a subset of these ( Panel 1 ) or adopt more stringent inclusion criteria.
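Applying inclusion criteria can be as simple as filtering a catalogued list of candidate documents. A hypothetical sketch; the field names, titles, and criteria are invented:

```python
# Sketch of applying inclusion criteria (topic and date range) to a
# catalogued list of candidate documents. All entries are illustrative.
from datetime import date

documents = [
    {"title": "National Nutrition Policy", "topic": "nutrition",
     "date": date(2014, 6, 1)},
    {"title": "Annual Health Statistics", "topic": "statistics",
     "date": date(2016, 3, 15)},
    {"title": "Nutrition Programme Evaluation", "topic": "nutrition",
     "date": date(2009, 11, 2)},
]

TOPIC = "nutrition"
START, END = date(2010, 1, 1), date(2020, 12, 31)

included = [d for d in documents
            if d["topic"] == TOPIC and START <= d["date"] <= END]
print([d["title"] for d in included])  # ['National Nutrition Policy']
```

Keeping the criteria in one place like this makes it easy to tighten or loosen them as the search yields too many or too few documents.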

Panel 1. Exploring the framing of diseases in Pakistani media

In Table 1 , we present a non-exhaustive list of the types of documents that can be included in document analyses of health policy issues. In most cases, this will mean written sources (policies, reports, articles). The types of documents to be analysed will vary by study and according to the research question, although in many cases, it will be useful to consult a mix of formal documents (such as official policies, laws or strategies), ‘gray literature’ (organizational materials such as reports, evaluations and white papers produced outside formal publication channels) and, whenever possible, informal or working documents (such as meeting notes, PowerPoint presentations and memoranda). These latter in particular can provide rich veins of insight into how policy actors are thinking through the issues under study, particularly for the lucky researcher who obtains working documents with ‘Track Changes.’ How you prioritize documents will depend on your research question: you may prioritize official policy documents if you are studying policy content, or you may prioritize informal documents if you are studying policy process.

Table 1. Types of documents that can be consulted in studies of health policy

During this initial preparatory phase, we also recommend devising a file-naming system for your documents (e.g. Author.Date.Topic.Institution.PDF), so that documents can be easily retrieved throughout the research process. After extracting data and processing your documents the first time around, you will likely have additional ‘questions’ to ask your documents and need to consult them again. For this reason, it is important to clearly name source files and link filenames to the data that you are extracting (see sample naming conventions in the Supplementary Materials ).
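A small helper can enforce the Author.Date.Topic.Institution.PDF pattern suggested above; the character-cleaning rule is our own assumption, chosen to keep filenames portable:

```python
# Sketch of a helper enforcing the Author.Date.Topic.Institution.PDF
# naming convention. The cleaning rule (alphanumerics and hyphens only)
# is an illustrative choice.
import re

def doc_filename(author: str, date: str, topic: str, institution: str,
                 ext: str = "pdf") -> str:
    def clean(part: str) -> str:
        # Replace anything that isn't alphanumeric or a hyphen.
        return re.sub(r"[^A-Za-z0-9-]+", "-", part.strip()).strip("-")
    return ".".join([clean(author), clean(date), clean(topic),
                     clean(institution)]) + f".{ext}"

print(doc_filename("Ministry of Health", "2010-05", "ChildSurvival", "Niger"))
# Ministry-of-Health.2010-05.ChildSurvival.Niger.pdf
```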

Step 2. Extract data

Data can be extracted in a number of ways, and the method you select for doing so will depend on your research question and the nature of your documents. One simple way is to use an Excel spreadsheet where each row is a document and each column is a category of information you are seeking to extract, from more basic data such as the document title, author and date, to theoretical or conceptual categories deriving from your research question, operating theory or analytical framework (Panel 2). Documents can also be imported into thematic coding software such as Atlas.ti or NVivo, and data extracted that way. Alternatively, if the research question focuses on process, documents can be used to compile a timeline of events, to trace processes across time. Ask yourself, how can I organize these data in the most coherent manner? What are my priority categories? We have included two different examples of data extraction tools in the Supplementary Materials to this article to spark ideas.
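The spreadsheet-style extraction tool described above can be sketched with Python's csv module: one row per document, one column per extraction category. The categories and the sample row are invented for illustration:

```python
# Sketch of a document-by-category extraction table written as CSV.
# Column names and the sample entry are illustrative, not prescriptive.
import csv

FIELDS = ["filename", "title", "author", "date", "actors_mentioned",
          "framing_of_issue", "notes"]

rows = [
    {"filename": "MoH.2010.ChildSurvival.Niger.pdf",
     "title": "Plan de Developpement Sanitaire",
     "author": "Ministere de la Sante Publique", "date": "2010",
     "actors_mentioned": "MoH; UNICEF; WHO",
     "framing_of_issue": "child survival as system strengthening",
     "notes": "annexes list district-level targets"},
]

with open("extraction.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

Because the filename column matches the file-naming convention, each extracted row can be traced back to its source document when new "questions" arise later.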

Panel 2. Case study: Documents tell part of the story in Niger

Document analyses are first and foremost exercises in close reading: documents should be read thoroughly, from start to finish, including annexes, which may seem tedious but which sometimes produce golden nuggets of information. Read for overall meaning as you extract specific data related to your research question. As you go along, you will begin to have ideas or build working theories about what you are learning and observing in the data. We suggest capturing these emerging theories in extended notes or ‘memos,’ as used in Grounded Theory methodology ( Charmaz, 2006 ); these can be useful analytical units in themselves and can also provide a basis for later report and article writing.

As you read more documents, you may find that your data extraction tool needs to be modified to capture all the relevant information (or to avoid wasting time capturing irrelevant information). This may require you to go back and seek information in documents you have already read and processed, which will be greatly facilitated by a coherent file-naming system. It is also useful to keep notes on other documents that are mentioned that should be tracked down (sometimes you can write the author for help). As a general rule, we suggest being parsimonious when selecting initial categories to extract from data. Simply reading the documents takes significant time in and of itself—make sure you think about how, exactly, the specific data you are extracting will be used and how it goes towards answering your research questions.

Step 3. Analyse data

As in all types of qualitative research, data collection and analysis are iterative and characterized by emergent design, meaning that developing findings continually inform whether and how to obtain and interpret data ( Creswell, 2013 ). In practice, this means that during the data extraction phase, the researcher is already analysing data and forming initial theories—as well as potentially modifying document selection criteria. However, only when data extraction is complete can one see the full picture. For example, are there any documents that you would have expected to find, but did not? Why do you think they might be missing? Are there temporal trends (i.e. similarities, differences or evolutions that stand out when documents are ordered chronologically)? What else do you notice? We provide a list of overarching questions you should think about when viewing your body of document as a whole ( Table 2 ).

Table 2. Questions to ask your overall body of documents

HIV and viral hepatitis articles by main frames (%). Note: The percentage of articles is calculated by dividing the number of articles appearing in each frame for viral hepatitis and HIV by the respective number of sampled articles for each disease (N = 137 for HIV; N = 117 for hepatitis). Time frame: 1 January 2006 to 30 September 2016

Representations of progress toward Millennium Development Goal 4 in Nigerien policy documents. Sources: clockwise from upper left: (WHO 2006); (Institut National de la Statistique 2010); (Ministère de la Santé Publique 2010); (Unicef 2010)

In addition to the meaning-making processes you are already engaged in during the data extraction process, in most cases, it will be useful to apply specific analysis methodologies to the overall corpus of your documents, such as policy analysis ( Buse et al. , 2005 ). An array of analysis methodologies can be used, both quantitative and qualitative, including case study methodology, thematic content analysis, discourse analysis, framework analysis and process tracing, which may require differing levels of familiarity and skills to apply (we highlight a few of these in the case studies below). Analysis can also be structured according to theoretical approaches. When it comes to analysing policies, process tracing can be particularly useful to combine multiple sources of information, establish a chronicle of events and reveal political and social processes, so as to create a narrative of the policy cycle ( Yin, 1994 ; Shiffman et al. , 2004 ). Practically, you will also want to take a holistic view of the documents’ ‘answers’ to the questions or analysis categories you applied during the data extraction phase. Overall, what did the documents ‘say’ about these thematic categories? What variation did you find within and between documents, and along which axes? Answers to these questions are best recorded by developing notes or memos, which again will come in handy as you write up your results.

As with all qualitative research, you will want to consider your own positionality towards the documents (and their sources and authors); it may be helpful to keep a ‘reflexivity’ memo documenting how your personal characteristics or pre-standing views might influence your analysis ( Watt, 2007 ).

Step 4. Distil your findings

You will know you have completed your document review when one of three things happens: (1) completeness (you feel satisfied you have obtained every document fitting your criteria—this is rare), (2) out of time (this means you should have used more specific criteria), and (3) saturation (you fully or sufficiently understand the phenomenon you are studying). In all cases, you should strive to make the third situation the reason for ending your document review, though this will not always mean you will have read and analysed every document fitting your criteria—just enough documents to feel confident you have found good answers to your research questions.
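The saturation criterion can be made concrete with a simple heuristic. The sketch below is an assumption, not a standard: it declares saturation once a chosen number of consecutive documents contribute no new codes.

```python
def reached_saturation(docs_codes, window=3):
    """Heuristic only: declare saturation once `window` consecutive
    documents (in review order) contribute no new codes.
    docs_codes: list of sets of codes found per document."""
    seen = set()
    run = 0
    for codes in docs_codes:
        new_codes = codes - seen
        if new_codes:
            seen |= new_codes
            run = 0
        else:
            run += 1
            if run >= window:
                return True
    return False
```

In practice the window size is a judgment call; the point is to make the stopping rule explicit and auditable rather than implicit.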

Now it is time to refine your findings. During the extraction phase, you did the equivalent of walking along the beach, noticing the beautiful shells, driftwood and sea glass, and picking them up along the way. During the analysis phase, you started sorting these items into different buckets (your analysis categories) and building increasingly detailed collections. Now you have returned home from the beach, and it is time to clean your objects, rinse them of sand and preserve only the best specimens for presentation. To do this, you can return to your memos, refine them, illustrate them with graphics and quotes and fill in any incomplete areas. It can also be illuminating to look across different strands of work: e.g. how did the content, style, authorship, or tone of arguments evolve over time? Can you illustrate which words, concepts or phrases were used by authors or author groups?

Results will often first be grouped by theoretical or analytic category, or presented as a policy narrative, interweaving strands from other methods you may have used (interviews, observation, etc.). It can also be helpful to create conceptual charts and graphs, especially as this corresponds to your analytical framework (Panels 1 and 2). If you have been keeping a timeline of events, you can seek out any missing information from other sources. Finally, ask yourself how the validity of your findings checks against what you have learned using other methods. The final products of the distillation process will vary by research study, but they will invariably allow you to state your findings relative to your research questions and to draw policy-relevant conclusions.
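If you have been keeping a timeline of events, ordering it chronologically for the policy narrative is straightforward. A minimal sketch, in which the dates, event descriptions and source labels are hypothetical:

```python
from datetime import date

# Hypothetical timeline entries gathered during extraction:
# (date of event, short description, source document)
events = [
    (date(2010, 6, 1), "National child mortality survey published", "MSP 2010"),
    (date(2006, 3, 15), "Country health system fact sheet released", "WHO 2006"),
]

# Order the policy narrative chronologically
for when, what, source in sorted(events):
    print(f"{when.isoformat()}: {what} ({source})")
```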

Document analysis is an essential component of health policy research—it is also relatively convenient and can be low cost. Using an organized system of analysis enhances the document analysis’s procedural rigour, allows for a fuller understanding of policy process and content and enhances the effectiveness of other methods such as interviews and non-participant observation. We propose the READ approach as a systematic method for interrogating documents and extracting study-relevant data that is flexible enough to accommodate many types of research questions. We hope that this article encourages discussion about how to make best use of data from documents when researching health policy questions.

Supplementary data are available at Health Policy and Planning online.

The data extraction tool in the Supplementary Materials for the iCCM case study (Panel 2) was conceived of by the research team for the multi-country study ‘Policy Analysis of Community Case Management for Childhood and Newborn Illnesses’. The authors thank Sara Bennett and Daniela Rodriguez for granting permission to publish this tool. S.M. was supported by The Olympia-Morata-Programme of Heidelberg University. The funders had no role in the decision to publish, or preparation of the manuscript. The content is the responsibility of the authors and does not necessarily represent the views of any funder.

Conflict of interest statement. None declared.

Ethical approval. No ethical approval was required for this study.

Abdelmutti N , Hoffman-Goetz L.   2009 . Risk messages about HPV, cervical cancer, and the HPV vaccine Gardasil: a content analysis of Canadian and U.S. national newspaper articles . Women & Health   49 : 422 – 40 .


Adebiyi BO, Mukumbang FC, Beytell A-M. 2019. To what extent is fetal alcohol spectrum disorder considered in policy-related documents in South Africa? A document review. Health Research Policy and Systems 17.

Atkinson PA , Coffey A.   1997 . Analysing documentary realities. In: Silverman D (ed). Qualitative Research: Theory, Method and Practice . London : SAGE .


Bardach E , Patashnik EM.   2015 . Practical Guide for Policy Analysis: The Eightfold Path to More Effective Problem Solving . Los Angeles : SAGE .

Bennett S , Dalglish SL , Juma PA , Rodríguez DC.   2015 . Altogether now… understanding the role of international organizations in iCCM policy transfer . Health Policy and Planning   30 : ii26 – 35 .

Berner-Rodoreda A, Bärnighausen T, Kennedy C et al. 2018. From doxastic to epistemic: a typology and critique of qualitative interview styles. Qualitative Inquiry 26: 291–305.

Bowen GA.   2009 . Document analysis as a qualitative research method . Qualitative Research Journal   9 : 27 – 40 .

Bryman A.   1994 . Analyzing Qualitative Data .

Buse K , Mays N , Walt G.   2005 . Making Health Policy . New York : Open University Press .

Charmaz K.   2006 . Constructing Grounded Theory: A Practical Guide through Qualitative Analysis . London : SAGE .

Claassen L , Smid T , Woudenberg F , Timmermans DRM.   2012 . Media coverage on electromagnetic fields and health: content analysis of Dutch newspaper articles and websites . Health, Risk & Society   14 : 681 – 96 .

Creswell JW.   2013 . Qualitative Inquiry and Research Design . Thousand Oaks, CA : SAGE .

Dalglish SL , Rodríguez DC , Harouna A , Surkan PJ.   2017 . Knowledge and power in policy-making for child survival in Niger . Social Science & Medicine   177 : 150 – 7 .

Dalglish SL , Surkan PJ , Diarra A , Harouna A , Bennett S.   2015 . Power and pro-poor policies: the case of iCCM in Niger . Health Policy and Planning   30 : ii84 – 94 .

Entman RM.   1993 . Framing: toward clarification of a fractured paradigm . Journal of Communication   43 : 51 – 8 .

Fournier G , Djermakoye IA.   1975 . Village health teams in Niger (Maradi Department). In: Newell KW (ed). Health by the People . Geneva : WHO .

Gurung G , Derrett S , Gauld R.   2020 . The role and functions of community health councils in New Zealand’s health system: a document analysis . The New Zealand Medical Journal   133 : 70 – 82 .

Harvey L.   1990 . Critical Social Research . London : Unwin Hyman .

Harvey SA.   2018 . Observe before you leap: why observation provides critical insights for formative research and intervention design that you’ll never get from focus groups, interviews, or KAP surveys . Global Health: Science and Practice   6 : 299 – 316 .

Institut National de la Statistique. 2010. Rapport National sur les Progrès vers l'atteinte des Objectifs du Millénaire pour le Développement. Niamey, Niger: INS.

Kamarulzaman A.   2013 . Fighting the HIV epidemic in the Islamic world . Lancet   381 : 2058 – 60 .

Katchmarchi AB , Taliaferro AR , Kipfer HJ.   2018 . A document analysis of drowning prevention education resources in the United States . International Journal of Injury Control and Safety Promotion   25 : 78 – 84 .

Krippendorff K.   2004 . Content Analysis: An Introduction to Its Methodology . SAGE .

Marten R.   2019 . How states exerted power to create the Millennium Development Goals and how this shaped the global health agenda: lessons for the sustainable development goals and the future of global health . Global Public Health   14 : 584 – 99 .

Maxwell JA.   2005 . Qualitative Research Design: An Interactive Approach , 2 nd edn. Thousand Oaks, CA : Sage Publications .

Mayring P.   2004 . Qualitative Content Analysis . In: Flick U, von Kardorff E, Steinke I (eds).   A Companion to Qualitative Research . SAGE .

Ministère de la Santé Publique. 2010. Enquête nationale sur la survie des enfants de 0 à 59 mois et la mortalité au Niger 2010. Niamey, Niger: MSP.

Mogalakwe M.   2009 . The documentary research method—using documentary sources in social research . Eastern Africa Social Science Research Review   25 : 43 – 58 .

Nelkin D.   1991 . AIDS and the news media . The Milbank Quarterly   69 : 293 – 307 .

Ouedraogo O , Doudou MH , Drabo KM  et al.    2020 . Policy overview of the multisectoral nutrition planning process: the progress, challenges, and lessons learned from Burkina Faso . The International Journal of Health Planning and Management   35 : 120 – 39 .

Prior L.   2003 . Using Documents in Social Research . London: SAGE .

Shiffman J , Stanton C , Salazar AP.   2004 . The emergence of political priority for safe motherhood in Honduras . Health Policy and Planning   19 : 380 – 90 .

Smith KC , Washington C , Welding K  et al.    2017 . Cigarette stick as valuable communicative real estate: a content analysis of cigarettes from 14 low-income and middle-income countries . Tobacco Control   26 : 604 – 7 .

Strömbäck J , Dimitrova DV.   2011 . Mediatization and media interventionism: a comparative analysis of Sweden and the United States . The International Journal of Press/Politics   16 : 30 – 49 .

UNICEF. 2010. Maternal, Newborn & Child Survival Profile. Niamey, Niger: UNICEF.

Watt D.   2007 . On becoming a qualitative researcher: the value of reflexivity . Qualitative Report   12 : 82 – 101 .

Weber M.   2015 . Bureaucracy. In: Waters T , Waters D (eds). Rationalism and Modern Society: New Translations on Politics, Bureaucracy, and Social Stratification . London : Palgrave MacMillan .

Wesley JJ.   2010 . Qualitative Document Analysis in Political Science.

World Health Organization. 2006. Country Health System Fact Sheet 2006: Niger. Niamey, Niger: WHO.

Yin R.   1994 . Case Study Research: Design and Methods . Thousand Oaks, CA : Sage .


  • Online ISSN 1460-2237
  • Copyright © 2024 The London School of Hygiene and Tropical Medicine and Oxford University Press


Document Analysis as a Qualitative Research Method

Qualitative Research Journal

ISSN : 1443-9883

Article publication date: 3 August 2009

This article examines the function of documents as a data source in qualitative research and discusses document analysis procedure in the context of actual research experiences. Targeted to research novices, the article takes a nuts‐and‐bolts approach to document analysis. It describes the nature and forms of documents, outlines the advantages and limitations of document analysis, and offers specific examples of the use of documents in the research process. The application of document analysis to a grounded theory study is illustrated.

  • Content analysis
  • Grounded theory
  • Thematic analysis
  • Triangulation

Bowen, G.A. (2009), "Document Analysis as a Qualitative Research Method", Qualitative Research Journal , Vol. 9 No. 2, pp. 27-40. https://doi.org/10.3316/QRJ0902027


Copyright © 2009, Emerald Group Publishing Limited


Monday, January 20, 2020

A QDA recipe? A ten-step approach for qualitative document analysis using MAXQDA


Guest post by Professional MAXQDA Trainer Dr. Daniel Rasch.

Introduction

Qualitative text or document analysis has evolved into one of the most used qualitative methods across several disciplines (Kuckartz, 2014; Mayring, 2010). Its straightforward structure and procedure enable researchers to adapt the method to nearly any need.

A ten-step approach for qualitative document analysis using MAXQDA

This article proposes a recipe of ten simple steps for conducting a qualitative document analysis (QDA) using MAXQDA (see Table 1 for an overview).

Table 1: Overview of the “QDA recipe”

The ten steps for conducting qualitative document analyses using MAXQDA

Step 1: The research question(s)

As always, research begins with the question(s). Three aspects should be covered when dealing with the research question(s):

  • What exactly do you want to find out?
  • What relevance does your research on this exact question have?
  • What contribution will your research make to your discipline?

Highlight these questions in your introduction and make your research stand out.

Step 2: Data collection and data sampling

After you have decided on the questions, you should think about how to answer them. What kind of qualitative data will best answer your question? Interviews – how many and with whom? Documents – which ones and where to collect them from?

At this point, you can already start thinking about validity: are you going to use a representative or a biased sample? Check the different options for sampling and their effects on validity (Krippendorff, 2019).
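One of the sampling options, a simple random sample, can be sketched in a few lines. The corpus filenames below are hypothetical; fixing the seed keeps the draw reproducible for your audit trail.

```python
import random

# Hypothetical corpus of candidate documents that met the inclusion criteria
corpus = [f"doc_{i:03d}.pdf" for i in range(120)]

random.seed(42)                       # fix the seed for a reproducible draw
sample = random.sample(corpus, k=20)  # simple random sample without replacement
```

Purposive or stratified sampling would replace the random draw with explicit selection rules, but should be documented just as transparently.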

Step 3: Select and prepare the data

For this step, MAXQDA 2020 is an excellent tool to help you prepare the selected data for any further steps. Whatever type of qualitative data you choose, you can import it into MAXQDA and have MAXQDA assist in transcribing it. In the end, qualitative document analysis is all about written forms of communication (Kuckartz, 2014).


Figure 1: Import the data you have chosen or selected
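Outside a QDA package, the same preparation step can be sketched in plain Python: read every text file in a corpus folder into memory for later coding. The folder layout and filenames are assumptions for illustration.

```python
from pathlib import Path

def load_corpus(folder):
    """Read every .txt file in `folder` into a {filename: text} mapping."""
    return {p.name: p.read_text(encoding="utf-8")
            for p in sorted(Path(folder).glob("*.txt"))}

# e.g. corpus = load_corpus("my_documents")
```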

Step 4: Codebook development

It takes time to develop a solid codebook. Working deductively, the process is a little easier with codes deriving from the theoretical considerations in the context of your research. Inductively, there are various steps you can use, ranging from creative coding to in-vivo-codes.

Content-wise, you can apply all sorts of codes, such as themes or evaluations, two of the most commonly used styles of content analysis (see thematic and evaluative content analysis in Kuckartz, 2014).


Figure 2: coding options in MAXQDA

A complete codebook entry should contain:

  • a brief definition,
  • a long definition,
  • criteria for when to use the code, 
  • criteria for when not to use the code, and
  • an example.

Using MAXQDA’s code memos simplifies the process of creating and maintaining a good codebook. First, you can always go back to the codes and view and review your codebook within your project, and second, you can simply export the codebook as an attachment or appendix for publication purposes (use: Reports > Codebook ).


Figure 3: Creating a new code with code memo
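Outside a QDA package, a codebook entry with these five elements can also be kept as plain structured data. The code name and all definitions below are invented for illustration.

```python
# One hypothetical codebook entry; the code name and definitions
# are invented, not taken from any real study
codebook = {
    "barrier_financial": {
        "brief": "Financial barriers to care",
        "long": "Any mention of costs, fees, or affordability limiting "
                "access to health services.",
        "use_when": "The text links an access problem explicitly to money.",
        "dont_use_when": "Costs are mentioned without any access consequence.",
        "example": "'Families could not afford the clinic fee.'",
    },
}
```

Keeping the codebook as data (rather than prose only) makes it trivial to export as an appendix or validate that every code has all five fields.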

Step 5: Unitizing and coding instructions

Before the process of coding starts, it is necessary to decide on the units of, as well as the rules for, coding. It is especially important to decide on your unit of coding (sentences, paragraphs, quasi-sentences, etc.). Coding rules help to keep this choice consistent and help you stick to your research question(s), because every passage you code and every memo you write should serve to answer your research question(s). Decision rules should also be added: what are you going to do if a passage does not fit your subcodes but should be coded because it is important for your research question?
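The choice of coding unit can be mirrored in code. A minimal sketch of unitizing follows; both splitters are deliberately naive (real projects may need a proper sentence tokenizer):

```python
import re

def split_units(text, unit="paragraph"):
    """Split a document into coding units: 'paragraph' or 'sentence'."""
    if unit == "paragraph":
        units = re.split(r"\n\s*\n", text)       # blank-line-separated blocks
    else:
        units = re.split(r"(?<=[.!?])\s+", text)  # naive sentence boundaries
    return [u.strip() for u in units if u.strip()]
```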

Step 6: Trial, training, reliability

Trial runs are of major importance. Not only do they show you which codes work and which do not, but they also help you rethink your choices in terms of the unit of coding, the content of the codebook, and reliability. Since there are different options for the latter, stick to what works best for you: either a qualitative comparison of what you have coded or, if need be, quantitative indicators like Krippendorff’s alpha.

You can test yourself or a team you work with, and there might even be situations where a reliability test is not helpful or needed. When testing the codebook, be sure to test across the full variability of your collected documents and make sure that the entire codebook is tested.

MAXQDA helps you compare different forms of agreement for an unlimited number of texts, divided into two different document groups (one document group coded by coder 1, a second document group coded by coder 2; note that you can also test yourself by acting as coder 2).


Figure 4: Intercoder agreement
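Krippendorff’s alpha is involved to compute by hand, but the closely related Cohen’s kappa for two coders illustrates the idea of chance-corrected agreement. A small sketch (the label lists in the usage example are hypothetical):

```python
def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement between two coders who labelled
    the same sequence of coding units."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    labels = set(coder1) | set(coder2)
    expected = sum((coder1.count(lab) / n) * (coder2.count(lab) / n)
                   for lab in labels)
    if expected == 1:  # both coders used a single identical label throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# e.g. cohens_kappa(["a", "b", "a", "b"], ["a", "b", "a", "b"])
```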

Step 7: Revision and modification

After checking which codes work and which do not, you can revise and modify the codebook. As Schreier puts it: “No coding frame (codebook – DR) is perfect” (Schreier, 2012: 147).

Step 8: Coding

There are many different coding strategies, but one thing is for sure: qualitative work needs time and reading, as well as working with the material over and over again.

One coding strategy might be to first make yourself comfortable with the documents and start coding only after a second or third reading. Another strategy is to concentrate on some of your codes first and do a second round of coding with the other codes later.

Step 9: Analyze and compare

Analyze and compare – these two words are the essence of the qualitative analysis at this step. At the core of each qualitative document analysis is the description of the content and the comparison of these contents between the documents you analyze.

After everything has been coded, you can make use of different analysis strategies: paraphrase, write summaries, look for intersections of codes, or identify patterns of similarity between the documents using simple or complex queries.


Figure 5: different analysis strategies in MAXQDA
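Looking for intersections of codes can be sketched as a co-occurrence count over coded segments. The unit ids and code names below are hypothetical:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded segments: unit id -> set of codes applied to that unit
coded_units = {
    1: {"financing", "access"},
    2: {"financing", "governance", "access"},
    3: {"governance"},
}

# Count how often each pair of codes co-occurs on the same unit
cooccurrence = Counter()
for codes in coded_units.values():
    for pair in combinations(sorted(codes), 2):
        cooccurrence[pair] += 1
```

High-frequency pairs point to passages worth re-reading together, which is what a query for code intersections surfaces in a QDA package.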

Step 10: Interpretation and presentation

Reporting and summarizing qualitative findings is difficult. Most often, we find simple descriptions of the content with the use of quotations, paraphrases or other references to the text. However, MAXQDA makes this faster and easier with many options to choose from. The easiest way is to generate a table to sum up your findings, if your data or the findings allow for this.

MAXQDA offers several options: map relations of codes, documents or memos with MAXMaps; create matrices between codes and documents ( Code Matrix Browser ) or between codes ( Code Relations Browser ) to display the distribution of codes inside your data; or use different colors to map the distribution of codes across single documents.

Figure 6: Visual Tools for presentation


The Code Matrix Browser also enables you to quantify the qualitative data in two clicks. You can export these numbers for further analysis with statistical packages, to run calculations such as regressions or correlations ( Rasch, 2018 ).
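The quantification the Code Matrix Browser performs can be approximated outside MAXQDA as a document-by-code frequency table exported to CSV for a statistical package. The documents, codes and counts here are invented:

```python
import csv

# Hypothetical code frequencies per document (a miniature code matrix)
matrix = {
    "doc_a.txt": {"financing": 3, "access": 1},
    "doc_b.txt": {"financing": 0, "access": 4},
}
codes = sorted({c for counts in matrix.values() for c in counts})

# Export as a document-by-code table readable by any statistics package
with open("code_matrix.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["document"] + codes)
    for doc, counts in sorted(matrix.items()):
        writer.writerow([doc] + [counts.get(c, 0) for c in codes])
```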

Summary and adoption

Qualitative document analysis is one of the most popular techniques and is adaptable to nearly every field. MAXQDA is a software tool that offers many options to make your analysis, and therefore your research, easier.

The recipe works best for theory-driven, deductive coding. However, it can also be used for inductive, explorative work by switching some of these steps around: for example, codebook development might take place during or after trial and testing, since codes are developed inductively during the coding process. Still, it is important to define these codes properly.

The above-mentioned recipe has been used as a basis for several publications by the author, ranging from a simple comparison of qualitative and quantitative text analysis ( Boräng et al., 2014 ), to the use of qualitative data as a basis for regression models ( Eising et al., 2015 ; Eising et al., 2017 ), to a book using mixed methods and therefore both qualitative and quantitative data analysis ( Rasch, 2018 ).

About the author

Daniel Rasch is a post-doctoral researcher in political science at the German University of Administrative Sciences, Speyer. He received his Ph.D. with a mixed methods analysis of lobbyists' success in the European Union. He focuses on the quantification of qualitative data. He is an experienced MAXQDA lecturer and has been a Professional MAXQDA Trainer since 2012.


The Qualitative Report


Conducting a Qualitative Document Analysis

Hani Morgan , University of Southern Mississippi Follow

Document analysis has been an underused approach to qualitative research. This approach can be valuable for various reasons. When used to analyze pre-existing texts, this method allows researchers to conduct studies they might otherwise not be able to complete. Some researchers may not have the resources or time needed to do field research. Although videoconferencing technology and other types of software can be used to reduce some of the obstacles qualitative researchers sometimes encounter, these tools are associated with various problems. Participants might be unskillful in using technology or may not be able to afford it. Conducting a document analysis can also reduce some of the ethical concerns associated with other qualitative methods. Since document analysis is a valuable research method, one would expect to find a wide variety of literature on this topic. Unfortunately, the literature on documentary research is scant. This paper is designed to close the gap in the literature on conducting a qualitative document analysis by focusing on the advantages and limitations of using documents as a source of data and providing strategies for selecting documents. It also offers reasons for using reflexive thematic analysis and includes a hypothetical example of how a researcher might conduct a document analysis.

document analysis, qualitative inquiry, reflexive thematic analysis

Author Bio(s)

Hani Morgan is a professor of education at the University of Southern Mississippi. He received his doctorate in Social and Philosophical Foundations of Education from Rutgers University. Morgan is the author of The World’s Highest-Scoring Students and the co-editor of The World Leaders in Education . He has also authored and co-authored more than 60 journal articles. Much of his research focuses on how various factors related to the learning environment affect students. Please direct correspondence to: University of Southern Mississippi, 118 College Drive #5057, Hattiesburg, MS 39406-0001; Email: [email protected] .


Creative Commons Attribution-Noncommercial 4.0 License


Recommended APA Citation

Morgan, H. (2022). Conducting a Qualitative Document Analysis. The Qualitative Report , 27 (1), 64-77. https://doi.org/10.46743/2160-3715/2022.5044




Data collection methods for evaluation: Document review

Resource link.

  • Data collection methods for evaluation - document review (PDF, 162KB)

This resource from the Centers for Disease Control and Prevention (CDC) provides a brief guide to using document review as a data collection method for evaluation.

This guide provides an overview of when to use document review, how to plan and conduct it, and its advantages and disadvantages. It is noted that document review is helpful for gathering background information, determining if program implementation reflects program plans, and developing other data collection tools for evaluation.

Document review has several advantages, including being relatively inexpensive, providing a behind-the-scenes look at a program that may not be directly observable, and bringing up issues not noted by other means. However, there are also potential disadvantages, such as information being incomplete or inaccurate, biased due to selective survival of information, and time-consuming to collect, review, and analyze many documents.

Centers for Disease Control and Prevention (CDC), (2018).  Data collection methods for evaluation: Document review  (No. 18). U.S. Dept. of Health and Human Services. https://www.cdc.gov/healthyyouth/evaluation/pdf/brief18.pdf


Criteria for Good Qualitative Research: A Comprehensive Review

  • Regular Article
  • Open access
  • Published: 18 September 2021
  • Volume 31, pages 679–689 (2022)


  • Drishti Yadav (ORCID: orcid.org/0000-0002-2974-0323)


This review aims to synthesize a published set of evaluative criteria for good qualitative research. The aim is to shed light on existing standards for assessing the rigor of qualitative research encompassing a range of epistemological and ontological standpoints. Using a systematic search strategy, published journal articles that deliberate criteria for rigorous research were identified. Then, references of relevant articles were surveyed to find noteworthy, distinct, and well-defined pointers to good qualitative research. This review presents an investigative assessment of the pivotal features in qualitative research that can permit the readers to pass judgment on its quality and to commend it as good research when objectively and adequately utilized. Overall, this review underlines the crux of qualitative research and accentuates the necessity to evaluate such research by the very tenets of its being. It also offers some prospects and recommendations to improve the quality of qualitative research. Based on the findings of this review, it is concluded that quality criteria are the aftereffect of socio-institutional procedures and existing paradigmatic conducts. Owing to the paradigmatic diversity of qualitative research, a single and specific set of quality criteria is neither feasible nor anticipated. Since qualitative research is not a cohesive discipline, researchers need to educate and familiarize themselves with applicable norms and decisive factors to evaluate qualitative research from within its theoretical and methodological framework of origin.


Introduction

“… It is important to regularly dialogue about what makes for good qualitative research” (Tracy, 2010, p. 837)

To decide what represents good qualitative research is highly debatable. There are numerous methods that are contained within qualitative research and that are established on diverse philosophical perspectives. Bryman et al. (2008, p. 262) suggest that “It is widely assumed that whereas quality criteria for quantitative research are well‐known and widely agreed, this is not the case for qualitative research.” Hence, the question “how to evaluate the quality of qualitative research” has been continuously debated. There are many areas of science and technology wherein these debates on the assessment of qualitative research have taken place. Examples include various areas of psychology: general psychology (Madill et al., 2000); counseling psychology (Morrow, 2005); and clinical psychology (Barker & Pistrang, 2005), and other disciplines of social sciences: social policy (Bryman et al., 2008); health research (Sparkes, 2001); business and management research (Johnson et al., 2006); information systems (Klein & Myers, 1999); and environmental studies (Reid & Gough, 2000). In the literature, these debates are enthused by the impression that the blanket application of criteria for good qualitative research developed around the positivist paradigm is improper. Such debates are based on the wide range of philosophical backgrounds within which qualitative research is conducted (e.g., Sandberg, 2000; Schwandt, 1996). The existence of methodological diversity led to the formulation of different sets of criteria applicable to qualitative research.

Among qualitative researchers, the dilemma of governing the measures to assess the quality of research is not a new phenomenon, especially when the virtuous triad of objectivity, reliability, and validity (Spencer et al., 2004) are not adequate. Occasionally, the criteria of quantitative research are used to evaluate qualitative research (Cohen & Crabtree, 2008; Lather, 2004). Indeed, Howe (2004) claims that the prevailing paradigm in educational research is scientifically based experimental research. Hypotheses and conjectures about the preeminence of quantitative research can weaken the worth and usefulness of qualitative research by neglecting the prominence of harmonizing match for purpose on research paradigm, the epistemological stance of the researcher, and the choice of methodology. Researchers have been reprimanded concerning this in “paradigmatic controversies, contradictions, and emerging confluences” (Lincoln & Guba, 2000).

In general, qualitative research proceeds from a very different paradigmatic stance and therefore demands distinctive criteria for evaluating good research and the kinds of contributions it can make. This review presents a series of evaluative criteria for qualitative researchers, arguing that the choice of criteria must be compatible with the unique nature of the research in question (its methodology, aims, and assumptions). It aims to help researchers identify some of the indispensable features, or markers, of high-quality qualitative research. In a nutshell, the purpose of this systematic literature review is to analyze the existing knowledge on high-quality qualitative research and to examine how qualitative research has been critically assessed across diverse paradigmatic stances. Unlike existing reviews, this review also suggests some critical directions for improving the quality of qualitative research under different epistemological and ontological perspectives, and it is intended to provide guidelines that accelerate future developments and dialogue among qualitative researchers about assessing qualitative research.

The rest of this review is structured as follows: Sect. Methods describes the method followed for performing this review. Sect. Criteria for Evaluating Qualitative Studies provides a comprehensive description of the criteria for evaluating qualitative studies, followed by a summary of strategies to improve the quality of qualitative research in Sect. Improving Quality: Strategies. Sect. How to Assess the Quality of the Research Findings? provides details on how to assess the quality of research findings, after which some quality checklists (as tools to evaluate quality) are discussed in Sect. Quality Checklists: Tools for Assessing the Quality. Finally, the review ends with the concluding remarks in Sect. Conclusions, Future Directions and Outlook, which also presents some prospects for enhancing the quality and usefulness of qualitative research in the social and techno-scientific research community.

Methods

For this review, a comprehensive literature search was performed across several databases selected for their high number of relevant results: IEEE Xplore, ScienceDirect, PubMed, Google Scholar, and Web of Science. The following keywords (and their combinations using the Boolean connectives OR/AND) were adopted for the search: qualitative research, criteria, quality, assessment, and validity. Synonyms for these keywords were collected and arranged in a logical structure (see Table 1). All publications in journals and conference proceedings from 1950 through 2021 were considered, along with additional articles extracted from the references of the papers identified in the electronic search. Because the initial screening retrieved a large number of publications on qualitative research, an inclusion criterion was added to the search string to restrict the results to those focused on criteria for good qualitative research.

From the selected databases, the search retrieved a total of 765 publications. After duplicate records were removed, the remaining 426 publications were screened for relevance on the basis of title and abstract, using the inclusion and exclusion criteria in Table 2: publications focusing on evaluation criteria for good qualitative research were included, whereas works that delivered only theoretical concepts on qualitative research were excluded. Based on this screening and eligibility assessment, 45 research articles offering explicit criteria for evaluating the quality of qualitative research were identified as relevant to this review.
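
The screening arithmetic described above (retrieve, deduplicate, then filter by inclusion/exclusion terms) can be sketched in a few lines of Python. The record format, titles, and keyword lists below are illustrative assumptions, not the review's actual corpus or search logic.

```python
# Sketch of a deduplicate-then-screen pipeline, under assumed record shapes.

def deduplicate(records):
    """Keep the first occurrence of each title (case-insensitive)."""
    seen, unique = set(), []
    for rec in records:
        key = rec["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

def screen(records, include_terms, exclude_terms):
    """Retain records whose title/abstract mention an inclusion term
    and no exclusion term."""
    kept = []
    for rec in records:
        text = (rec["title"] + " " + rec.get("abstract", "")).lower()
        if any(t in text for t in include_terms) and not any(
            t in text for t in exclude_terms
        ):
            kept.append(rec)
    return kept

records = [
    {"title": "Criteria for qualitative research", "abstract": "evaluation criteria"},
    {"title": "Criteria for Qualitative Research", "abstract": "duplicate entry"},
    {"title": "A theory of grounded theory", "abstract": "theoretical concepts only"},
]
unique = deduplicate(records)  # the second record is a duplicate title
eligible = screen(unique, include_terms=["criteria"],
                  exclude_terms=["theoretical concepts"])
print(len(unique), len(eligible))  # -> 2 1
```

In a real review, the title/abstract screening is a human judgment; a mechanical filter like this only narrows the pool before that judgment is applied.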

Figure 1 illustrates the complete review process in the form of a PRISMA flow diagram. PRISMA ("Preferred Reporting Items for Systematic Reviews and Meta-Analyses") is employed in systematic reviews to improve the quality of reporting.

Figure 1: PRISMA flow diagram illustrating the search and inclusion process. N represents the number of records.

Criteria for Evaluating Qualitative Studies

Fundamental Criteria: General Research Quality

Various researchers have put forward criteria for evaluating qualitative research, summarized in Table 3. The criteria outlined in Table 4 also capture the various approaches for evaluating and assessing the quality of qualitative work; its entries are based on Tracy's "Eight big-tent criteria for excellent qualitative research" (Tracy, 2010). Tracy argues that high-quality qualitative work should be judged on the worthiness, relevance, timeliness, significance, morality, and practicality of the research topic, and on the ethical stance of the research itself. Researchers have also suggested a series of questions as guiding principles for assessing the quality of a qualitative study (Mays & Pope, 2020). Nassaji (2020) argues that good qualitative research should be robust, well informed, and thoroughly documented.

Qualitative Research: Interpretive Paradigms

All qualitative researchers work from highly abstract principles that combine beliefs about ontology, epistemology, and methodology. These beliefs govern how the researcher perceives and acts. The net that encompasses the researcher's epistemological, ontological, and methodological premises is referred to as a paradigm, or an interpretive framework: a "basic set of beliefs that guides action" (Guba, 1990). Four major interpretive paradigms structure qualitative research: positivist and postpositivist; constructivist-interpretive; critical (Marxist, emancipatory); and feminist-poststructural. The complexity of these four abstract paradigms increases at the level of concrete, specific interpretive communities. Table 5 presents these paradigms and their assumptions, including their criteria for evaluating research and the typical form that an interpretive or theoretical statement assumes in each paradigm. Moreover, quantitative conceptualizations of reliability and validity have proven incompatible with evaluating qualitative research (Horsburgh, 2003). In addition, a series of questions has been put forward in the literature to assist a reviewer (proficient in qualitative methods) in the meticulous assessment and endorsement of qualitative research (Morse, 2003). Hammersley (2007) suggests that guiding principles for qualitative research are advantageous, but that methodological pluralism should not be uncritically accepted across all qualitative approaches. Seale (1999) likewise points out the significance of methodological awareness in research studies.

Table 5 reflects that criteria for assessing the quality of qualitative research are the product of socio-institutional practices and existing paradigmatic standpoints. Owing to the paradigmatic diversity of qualitative research, a single set of quality criteria is neither possible nor desirable. Researchers must therefore be reflexive about the criteria they use in the various roles they play within their research community.

Improving Quality: Strategies

Another critical question is how qualitative researchers can ensure that the abovementioned quality criteria are met. Lincoln and Guba (1986) delineated several strategies for strengthening each criterion of trustworthiness, and other researchers (Merriam & Tisdell, 2016; Shenton, 2004) have presented similar strategies. A brief description of these strategies is given in Table 6.

It is worth mentioning that generalizability is also an integral part of qualitative research (Hays & McKibben, 2021). The guiding principle for generalizability concerns inducing and comprehending knowledge so as to synthesize the interpretive components of an underlying context. Table 7 summarizes the main metasynthesis steps required to ascertain generalizability in qualitative research.

Figure 2 reflects the crucial components of a conceptual framework and their contribution to decisions regarding research design, implementation, and the application of results to future thinking, study, and practice (Johnson et al., 2020). The synergy and interrelationship of these components signify their role across the different stages of a qualitative research study.

Figure 2: Essential elements of a conceptual framework.

In a nutshell, to assess the rationale of a study, its conceptual framework, and its research question(s), quality criteria must take account of the following: a lucid context for the problem statement in the introduction; well-articulated research problems and questions; a precise conceptual framework; a distinct research purpose; and clear presentation and investigation of the paradigms. These criteria expedite the quality of qualitative research.

How to Assess the Quality of the Research Findings?

The inclusion of quotes or similar research data enhances confirmability in the write-up of the findings. Expressions such as "80% of all respondents agreed that" or "only one of the interviewees mentioned that" may also quantify qualitative findings (Stenfors et al., 2020), although persuasive reasons why such quantification may not strengthen the research have also been given (Monrouxe & Rees, 2020). Further, the Discussion and Conclusion sections of an article are robust markers of high-quality qualitative research, as elucidated in Table 8.
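
As a minimal illustration of the quantification idea quoted above, simple counts over coded responses yield exactly this kind of expression. The participant identifiers and codes below are invented for illustration; they are not data from any study.

```python
# Sketch: turn coded qualitative responses into a percentage statement.
# Each participant maps to the set of codes assigned to their response.
coded_responses = {
    "P1": {"agrees_with_policy"},
    "P2": {"agrees_with_policy", "raises_cost_concern"},
    "P3": {"agrees_with_policy"},
    "P4": {"agrees_with_policy"},
    "P5": {"raises_cost_concern"},
}

def share_with_code(coded, code):
    """Fraction of participants whose response carries the given code."""
    hits = sum(1 for codes in coded.values() if code in codes)
    return hits / len(coded)

print(f"{share_with_code(coded_responses, 'agrees_with_policy'):.0%} "
      "of all respondents agreed")  # -> 80% of all respondents agreed
```

Whether such counts add or subtract value is exactly the debate the sources above describe; the arithmetic itself is trivial.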

Quality Checklists: Tools for Assessing the Quality

Numerous checklists are available to speed up the assessment of the quality of qualitative research. Used uncritically, however, and without regard to the research context, these checklists may be counterproductive. I recommend such lists and guiding principles as aids for pinpointing the markers of high-quality qualitative research; but, given the enormous variation in authors' theoretical and philosophical contexts, heavy reliance on a checklist may say little about whether the findings can be applied in your setting. A combination of such checklists might be appropriate for novice researchers. Some of these checklists are listed below:

The most commonly used framework is the Consolidated Criteria for Reporting Qualitative Research (COREQ) (Tong et al., 2007), which some journals recommend authors follow during article submission.

Standards for Reporting Qualitative Research (SRQR) is another checklist, developed in the context of medical education (O’Brien et al., 2014).

Also, Tracy ( 2010 ) and Critical Appraisal Skills Programme (CASP, 2021 ) offer criteria for qualitative research relevant across methods and approaches.

Further, researchers have also outlined different criteria as hallmarks of high-quality qualitative research. For instance, the “Road Trip Checklist” (Epp & Otnes, 2021 ) provides a quick reference to specific questions to address different elements of high-quality qualitative research.

Conclusions, Future Directions, and Outlook

This work presents a broad review of the criteria for good qualitative research, together with an exploratory analysis of the essential elements that enable readers of qualitative work to judge it as good research. Some of the essential markers of high-quality qualitative research have been highlighted; I scope them narrowly to achieving rigor, and note that they do not completely cover the broader considerations necessary for high-quality research. This review points out that no universal, one-size-fits-all guideline for evaluating the quality of qualitative research exists; in other words, there is no single set of guidelines common to all qualitative researchers. At the same time, the review reinforces that each qualitative approach should be treated on its own terms, on account of its distinctive features and its epistemological and disciplinary position. Because the worth of qualitative research is sensitive to the specific context and paradigmatic stance, researchers should themselves analyze which approaches can, and must, be tailored to suit the distinct characteristics of the phenomenon under investigation. Although this article does not claim to offer a magic bullet or a one-stop solution for dilemmas about how, why, or whether to evaluate the "goodness" of qualitative research, it offers a platform to assist researchers in improving their qualitative studies: an assembly of concerns to reflect on, a series of questions to ask, and multiple sets of criteria to consider when attempting to determine the quality of qualitative research. Overall, this review underlines the crux of qualitative research and accentuates the need to evaluate such research by the very tenets of its being.

By bringing together the vital arguments and delineating the requirements that good qualitative research should satisfy, this review strives to equip researchers and reviewers alike to make well-informed judgments about the worth and significance of the qualitative research under scrutiny. In a nutshell, a comprehensive portrayal of the research process (from the context, objectives, research questions, and design, through its theoretical foundations, to the approaches for collecting data, analyzing results, and deriving inferences) enhances the quality of a qualitative study.

Prospects: A Road Ahead for Qualitative Research

Irrefutably, qualitative research is a vibrant and evolving discipline in which different epistemological and disciplinary positions have their own characteristics and importance. Not surprisingly, owing to its evolving and varied nature, no consensus has been reached to date. Researchers have raised various concerns and proposed several recommendations for editors and reviewers on conducting reviews of critical qualitative research (Levitt et al., 2021; McGinley et al., 2021). The following are some prospects and recommendations for the maturation of qualitative research and its quality evaluation:

In general, most manuscript and grant reviewers are not qualitative experts and are therefore likely to prefer a broad set of criteria. However, researchers and reviewers need to keep in mind that it is inappropriate to apply the same approaches and conventions to all qualitative research. Future work should therefore focus on educating researchers and reviewers about evaluating qualitative research from within the appropriate theoretical and methodological context.

There is an urgent need to refurbish and augment the critical assessment of some well-known and widely accepted tools (including checklists such as COREQ and SRQR), interrogating their applicability in different contexts along with their epistemological ramifications.

Efforts should be made towards creating more space for creativity, experimentation, and a dialogue between the diverse traditions of qualitative research. This would potentially help to avoid the enforcement of one's own set of quality criteria on the work carried out by others.

Moreover, journal reviewers need to be aware of various methodological practices and philosophical debates.

It is pivotal to highlight the expressions and considerations of qualitative researchers and bring them into a more open and transparent dialogue about assessing qualitative research in techno-scientific, academic, sociocultural, and political arenas.

Frequent debates on the use of evaluative criteria are required to resolve some outstanding issues (including the applicability of a single set of criteria across disciplines). Such debates would not only benefit qualitative researchers themselves, but would primarily help augment the well-being and vitality of the entire discipline.

To conclude, I speculate that these criteria, and my perspective, may transfer to other methods, approaches, and contexts. I hope they spark dialogue and debate about criteria for excellent qualitative research, and about the underpinnings of the discipline more broadly, and thereby help improve the quality of qualitative studies. I also anticipate that this review will help researchers reflect on the quality of their own work, substantiate their research designs, and help reviewers evaluate qualitative research for journals. On a final note, I pinpoint the need for qualitative researchers of different disciplines and theoretic-paradigmatic origins to cooperatively formulate a framework encompassing the prerequisites of a qualitative study. I believe that tailoring such a framework of guiding principles paves the way for qualitative researchers to consolidate the status of qualitative research in the wide-ranging open-science debate. Dialogue on this issue across different approaches is crucial for the future of socio-techno-educational research.

References

Amin, M. E. K., Nørgaard, L. S., Cavaco, A. M., Witry, M. J., Hillman, L., Cernasev, A., & Desselle, S. P. (2020). Establishing trustworthiness and authenticity in qualitative pharmacy research. Research in Social and Administrative Pharmacy, 16(10), 1472–1482.


Barker, C., & Pistrang, N. (2005). Quality criteria under methodological pluralism: Implications for conducting and evaluating research. American Journal of Community Psychology, 35 (3–4), 201–212.

Bryman, A., Becker, S., & Sempik, J. (2008). Quality criteria for quantitative, qualitative and mixed methods research: A view from social policy. International Journal of Social Research Methodology, 11 (4), 261–276.

Caelli, K., Ray, L., & Mill, J. (2003). ‘Clear as mud’: Toward greater clarity in generic qualitative research. International Journal of Qualitative Methods, 2 (2), 1–13.

CASP (2021). CASP checklists. Retrieved May 2021 from https://casp-uk.net/casp-tools-checklists/

Cohen, D. J., & Crabtree, B. F. (2008). Evaluative criteria for qualitative research in health care: Controversies and recommendations. The Annals of Family Medicine, 6 (4), 331–339.

Denzin, N. K., & Lincoln, Y. S. (2005). Introduction: The discipline and practice of qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), The sage handbook of qualitative research (pp. 1–32). Sage Publications Ltd.


Elliott, R., Fischer, C. T., & Rennie, D. L. (1999). Evolving guidelines for publication of qualitative research studies in psychology and related fields. British Journal of Clinical Psychology, 38 (3), 215–229.

Epp, A. M., & Otnes, C. C. (2021). High-quality qualitative research: Getting into gear. Journal of Service Research . https://doi.org/10.1177/1094670520961445

Guba, E. G. (1990). The paradigm dialog. Sage Publications.

Hammersley, M. (2007). The issue of quality in qualitative research. International Journal of Research and Method in Education, 30 (3), 287–305.

Haven, T. L., Errington, T. M., Gleditsch, K. S., van Grootel, L., Jacobs, A. M., Kern, F. G., & Mokkink, L. B. (2020). Preregistering qualitative research: A Delphi study. International Journal of Qualitative Methods, 19 , 1609406920976417.

Hays, D. G., & McKibben, W. B. (2021). Promoting rigorous research: Generalizability and qualitative research. Journal of Counseling and Development, 99 (2), 178–188.

Horsburgh, D. (2003). Evaluation of qualitative research. Journal of Clinical Nursing, 12 (2), 307–312.

Howe, K. R. (2004). A critique of experimentalism. Qualitative Inquiry, 10 (1), 42–46.

Johnson, J. L., Adkins, D., & Chauvin, S. (2020). A review of the quality indicators of rigor in qualitative research. American Journal of Pharmaceutical Education, 84 (1), 7120.

Johnson, P., Buehring, A., Cassell, C., & Symon, G. (2006). Evaluating qualitative management research: Towards a contingent criteriology. International Journal of Management Reviews, 8 (3), 131–156.

Klein, H. K., & Myers, M. D. (1999). A set of principles for conducting and evaluating interpretive field studies in information systems. MIS Quarterly, 23 (1), 67–93.

Lather, P. (2004). This is your father’s paradigm: Government intrusion and the case of qualitative research in education. Qualitative Inquiry, 10 (1), 15–34.

Levitt, H. M., Morrill, Z., Collins, K. M., & Rizo, J. L. (2021). The methodological integrity of critical qualitative research: Principles to support design and research review. Journal of Counseling Psychology, 68 (3), 357.

Lincoln, Y. S., & Guba, E. G. (1986). But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Directions for Program Evaluation, 1986 (30), 73–84.

Lincoln, Y. S., & Guba, E. G. (2000). Paradigmatic controversies, contradictions and emerging confluences. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 163–188). Sage Publications.

Madill, A., Jordan, A., & Shirley, C. (2000). Objectivity and reliability in qualitative analysis: Realist, contextualist and radical constructionist epistemologies. British Journal of Psychology, 91 (1), 1–20.

Mays, N., & Pope, C. (2020). Quality in qualitative research. Qualitative Research in Health Care . https://doi.org/10.1002/9781119410867.ch15

McGinley, S., Wei, W., Zhang, L., & Zheng, Y. (2021). The state of qualitative research in hospitality: A 5-year review 2014 to 2019. Cornell Hospitality Quarterly, 62 (1), 8–20.

Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and implementation (4th ed.). Jossey-Bass.

Meyer, M., & Dykes, J. (2019). Criteria for rigor in visualization design study. IEEE Transactions on Visualization and Computer Graphics, 26 (1), 87–97.

Monrouxe, L. V., & Rees, C. E. (2020). When I say… quantification in qualitative research. Medical Education, 54 (3), 186–187.

Morrow, S. L. (2005). Quality and trustworthiness in qualitative research in counseling psychology. Journal of Counseling Psychology, 52 (2), 250.

Morse, J. M. (2003). A review committee’s guide for evaluating qualitative proposals. Qualitative Health Research, 13 (6), 833–851.

Nassaji, H. (2020). Good qualitative research. Language Teaching Research, 24 (4), 427–431.

O’Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for reporting qualitative research: A synthesis of recommendations. Academic Medicine, 89 (9), 1245–1251.

O’Connor, C., & Joffe, H. (2020). Intercoder reliability in qualitative research: Debates and practical guidelines. International Journal of Qualitative Methods, 19 , 1609406919899220.

Reid, A., & Gough, S. (2000). Guidelines for reporting and evaluating qualitative research: What are the alternatives? Environmental Education Research, 6 (1), 59–91.

Rocco, T. S. (2010). Criteria for evaluating qualitative studies. Human Resource Development International . https://doi.org/10.1080/13678868.2010.501959

Sandberg, J. (2000). Understanding human competence at work: An interpretative approach. Academy of Management Journal, 43 (1), 9–25.

Schwandt, T. A. (1996). Farewell to criteriology. Qualitative Inquiry, 2 (1), 58–72.

Seale, C. (1999). Quality in qualitative research. Qualitative Inquiry, 5 (4), 465–478.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22 (2), 63–75.

Sparkes, A. C. (2001). Myth 94: Qualitative health researchers will agree about validity. Qualitative Health Research, 11 (4), 538–552.

Spencer, L., Ritchie, J., Lewis, J., & Dillon, L. (2004). Quality in qualitative evaluation: A framework for assessing research evidence. Government Chief Social Researcher’s Office, Cabinet Office.

Stenfors, T., Kajamaa, A., & Bennett, D. (2020). How to assess the quality of qualitative research. The Clinical Teacher, 17 (6), 596–599.

Taylor, E. W., Beck, J., & Ainsworth, E. (2001). Publishing qualitative adult education research: A peer review perspective. Studies in the Education of Adults, 33 (2), 163–179.

Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19 (6), 349–357.

Tracy, S. J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16 (10), 837–851.


Open access funding provided by TU Wien (TUW).

Author information

Authors and Affiliations

Faculty of Informatics, Technische Universität Wien, 1040, Vienna, Austria

Drishti Yadav


Corresponding author

Correspondence to Drishti Yadav .

Ethics declarations

Conflict of Interest

The author declares no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Yadav, D. Criteria for Good Qualitative Research: A Comprehensive Review. Asia-Pacific Edu Res 31 , 679–689 (2022). https://doi.org/10.1007/s40299-021-00619-0


Accepted : 28 August 2021

Published : 18 September 2021

Issue Date : December 2022

DOI : https://doi.org/10.1007/s40299-021-00619-0


Keywords:

  • Qualitative research
  • Evaluative criteria


Documentary Analysis – Methods, Applications and Examples


Documentary Analysis

Definition:

Documentary analysis, also referred to as document analysis, is a systematic procedure for reviewing or evaluating documents. This method involves a detailed review of documents to extract themes or patterns relevant to the research topic.

Documents used in this type of analysis include a wide variety of materials, such as texts (words) and images that have been recorded without a researcher’s intervention. The domain of document analysis therefore includes all kinds of texts (books, newspapers, letters, study reports, diaries, and more) as well as images such as maps, photographs, and films.

Documentary analysis provides valuable insight and a unique perspective on the past, contextualizes the present, and provides a baseline for future studies. It is also an essential tool in case studies and in situations where direct or participant observation is not possible.

The process usually involves several steps:

  • Sourcing : This involves identifying the document or source, its origin, and the context in which it was created.
  • Contextualizing : This involves understanding the social, economic, political, and cultural circumstances during the time the document was created.
  • Interrogating : This involves asking a series of questions to help understand the document better. For example, who is the author? What is the purpose of the document? Who is the intended audience?
  • Making inferences : This involves understanding what the document says (either directly or indirectly) about the topic under study.
  • Checking for reliability and validity : Just like other research methods, documentary analysis also involves checking for the validity and reliability of the documents being analyzed.
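
The steps above can be expressed as a simple, reusable checklist. The step names and questions below merely restate the list for illustration; they are not a standard instrument, and the document title used is hypothetical.

```python
# Sketch: the five documentary-analysis steps as an ordered checklist
# a reviewer could walk through for each document.
DOCUMENT_REVIEW_STEPS = [
    ("sourcing", "Who created this document, where, and when?"),
    ("contextualizing", "What social, economic, political, and cultural "
                        "circumstances surrounded its creation?"),
    ("interrogating", "Who is the author, what is the purpose, and who is "
                      "the intended audience?"),
    ("making inferences", "What does it say, directly or indirectly, about "
                          "the topic under study?"),
    ("reliability & validity", "Can the document be trusted and corroborated?"),
]

def review_checklist(document_title):
    """Return the ordered questions to answer for one document."""
    return [f"[{step}] {question} ({document_title})"
            for step, question in DOCUMENT_REVIEW_STEPS]

for line in review_checklist("1914 diplomatic correspondence"):
    print(line)
```

In practice the answers, not the checklist, carry the analytic weight; the structure just keeps the review systematic across documents.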

Documentary Analysis Methods

Documentary analysis as a qualitative research method involves a systematic process. Here are the main steps you would generally follow:

Defining the Research Question

Before you start any research, you need a clear and focused research question. This will guide your decision on which documents to analyze and what to look for within them.

Selecting the Documents

Once you know what you’re looking for, you can start to select the relevant documents. These can be a wide range of materials – books, newspapers, letters, official reports, diaries, transcripts of speeches, archival materials, websites, social media posts, and more. They can be primary sources (directly from the time/place/person you are studying) or secondary sources (analyses created by others).

Reading and Interpreting the Documents

You need to closely read the selected documents to identify the themes and patterns that relate to your research question. This might involve content analysis (looking at what is explicitly stated) and discourse analysis (looking at what is implicitly stated or implied). You need to understand the context in which the document was created, the author’s purpose, and the audience’s perspective.

Coding and Categorizing the Data

After the initial reading, the data (text) can be broken down into smaller parts or “codes.” These codes can then be categorized based on their similarities and differences. This process of coding helps in organizing the data and identifying patterns or themes.
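
A minimal sketch of this coding step is shown below, assuming a hypothetical keyword-based codebook. Real qualitative coding is interpretive rather than purely mechanical, so treat this as an organizational aid, not the method itself; the codes, keywords, and interview segments are all invented.

```python
# Sketch: assign codes to text segments by matching codebook keywords.
from collections import defaultdict

CODEBOOK = {
    "workload": ["overworked", "hours", "grading"],
    "support": ["mentor", "help", "colleagues"],
}

def code_segments(segments, codebook=CODEBOOK):
    """Assign each segment every code whose keywords it mentions."""
    coded = defaultdict(list)
    for seg in segments:
        lower = seg.lower()
        for code, keywords in codebook.items():
            if any(kw in lower for kw in keywords):
                coded[code].append(seg)
    return dict(coded)

segments = [
    "I feel overworked during grading season.",
    "My mentor teacher offers real help.",
]
themes = code_segments(segments)
print(sorted(themes))  # -> ['support', 'workload']
```

Grouping the resulting codes into broader categories (the categorizing step) then proceeds by comparing the segments collected under each code.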

Analyzing the Data

Once the data is organized, it can be analyzed to make sense of it. This can involve comparing the data with existing theories, examining relationships between categories, or explaining the data in relation to the research question.

Validating the Findings

The researcher needs to ensure that the findings are accurate and credible. This might involve triangulating the data (comparing it with other sources or types of data), considering alternative explanations, or seeking feedback from others.

Reporting the Findings

The final step is to report the findings in a clear, structured way. This should include a description of the methods used, the findings, and the researcher’s interpretations and conclusions.

Applications of Documentary Analysis

Documentary analysis is widely used across a variety of fields and disciplines due to its flexible and comprehensive nature. Here are some specific applications:

Historical Research

Documentary analysis is a fundamental method in historical research. Historians use documents to reconstruct past events, understand historical contexts, and interpret the motivations and actions of historical figures. Documents analyzed may include personal letters, diaries, official records, newspaper articles, photographs, and more.

Social Science Research

Sociologists, anthropologists, and political scientists use documentary analysis to understand social phenomena, cultural practices, political events, and more. This might involve analyzing government policies, organizational records, media reports, social media posts, and other documents.

Legal Research

In law, documentary analysis is used in case analysis and statutory interpretation. Legal practitioners and scholars analyze court decisions, statutes, regulations, and other legal documents.

Business and Market Research

Companies often analyze documents to gather business intelligence, understand market trends, and make strategic decisions. This might involve analyzing competitor reports, industry news, market research studies, and more.

Media and Communication Studies

Scholars in these fields might analyze media content (e.g., news reports, advertisements, social media posts) to understand media narratives, public opinion, and communication practices.

Literary and Film Studies

In these fields, the “documents” might be novels, poems, films, or scripts. Scholars analyze these texts to interpret their meaning, understand their cultural context, and critique their form and content.

Educational Research

Educational researchers may analyze curricula, textbooks, lesson plans, and other educational documents to understand educational practices and policies.

Health Research

Health researchers may analyze medical records, health policies, clinical guidelines, and other documents to study health behaviors, healthcare delivery, and health outcomes.

Examples of Documentary Analysis

Here are some examples of documentary analysis:

  • Example 1 : A historian studying the causes of World War I might analyze diplomatic correspondence, government records, newspaper articles, and personal diaries from the period leading up to the war.
  • Example 2 : A policy analyst trying to understand the impact of a new public health policy might analyze the policy document itself, as well as related government reports, statements from public health officials, and news media coverage of the policy.
  • Example 3 : A market researcher studying consumer trends might analyze social media posts, customer reviews, industry reports, and news articles related to the market they’re studying.
  • Example 4 : An education researcher might analyze curriculum documents, textbooks, and lesson plans to understand how a particular subject is being taught in schools. They might also analyze policy documents to understand the broader educational policy context.
  • Example 5 : A criminologist studying hate crimes might analyze police reports, court records, news reports, and social media posts to understand patterns in hate crimes, as well as societal and institutional responses to them.
  • Example 6 : A journalist writing a feature article on homelessness might analyze government reports on homelessness, policy documents related to housing and social services, news articles on homelessness, and social media posts from people experiencing homelessness.
  • Example 7 : A literary critic studying a particular author might analyze their novels, letters, interviews, and reviews of their work to gain insight into their themes, writing style, influences, and reception.

When to Use Documentary Analysis

Documentary analysis can be used in a variety of research contexts, including but not limited to:

  • When direct access to research subjects is limited : If you are unable to conduct interviews or observations due to geographical, logistical, or ethical constraints, documentary analysis can provide an alternative source of data.
  • When studying the past : Documents can provide a valuable window into historical events, cultures, and perspectives. This is particularly useful when the people involved in these events are no longer available for interviews or when physical evidence is lacking.
  • When corroborating other sources of data : If you have collected data through interviews, surveys, or observations, analyzing documents can provide additional evidence to support or challenge your findings. This process of triangulation can enhance the validity of your research.
  • When seeking to understand the context : Documents can provide background information that helps situate your research within a broader social, cultural, historical, or institutional context. This can be important for interpreting your other data and for making your research relevant to a wider audience.
  • When the documents are the focus of the research : In some cases, the documents themselves might be the subject of your research. For example, you might be studying how a particular topic is represented in the media, how an author’s work has evolved over time, or how a government policy was developed.
  • When resources are limited : Compared to methods like experiments or large-scale surveys, documentary analysis can often be conducted with relatively limited resources. It can be a particularly useful method for students, independent researchers, and others who are working with tight budgets.
  • When providing an audit trail for future researchers : Documents provide a record of events, decisions, or conditions at specific points in time. They can serve as an audit trail for future researchers who want to understand the circumstances surrounding a particular event or period.

Purpose of Documentary Analysis

Documentary analysis can serve several purposes in research. Here are some key reasons why a researcher might choose this method:

  • Understanding Context : Documents can provide rich contextual information about the period, environment, or culture under investigation. This can be especially useful for historical research, where the context is often key to understanding the events or trends being studied.
  • Direct Source of Data : Documents can serve as primary sources of data. For instance, a letter from a historical figure can give unique insights into their thoughts, feelings, and motivations. A company’s annual report can offer firsthand information about its performance and strategy.
  • Corroboration and Verification : Documentary analysis can be used to validate and cross-verify findings derived from other research methods. For example, if interviews suggest a particular outcome, relevant documents can be reviewed to confirm the accuracy of this finding.
  • Substituting for Other Methods : When access to the field or subjects is not possible due to various constraints (geographical, logistical, or ethical), documentary analysis can serve as an alternative to methods like observation or interviews.
  • Unobtrusive Method : Unlike some other research methods, documentary analysis doesn’t require interaction with subjects, and therefore doesn’t risk altering the behavior of those subjects.
  • Longitudinal Analysis : Documents can be used to study change over time. For example, a researcher might analyze census data from multiple decades to study demographic changes.
  • Providing Rich, Qualitative Data : Documents often provide qualitative data that can help researchers understand complex issues in depth. For example, a policy document might reveal not just the details of the policy, but also the underlying beliefs and attitudes that shaped it.

Advantages of Documentary Analysis

Documentary analysis offers several advantages as a research method:

  • Unobtrusive : As a non-reactive method, documentary analysis does not require direct interaction with human subjects, which means that the research doesn’t affect or influence the subjects’ behavior.
  • Rich Historical and Contextual Data : Documents can provide a wealth of historical and contextual information. They allow researchers to examine events and perspectives from the past, even from periods long before modern research methods were established.
  • Efficiency and Accessibility : Many documents are readily accessible, especially with the proliferation of digital archives and databases. This accessibility can often make documentary analysis a more efficient method than others that require data collection from human subjects.
  • Cost-Effective : Compared to other methods, documentary analysis can be relatively inexpensive. It generally requires fewer resources than conducting experiments, surveys, or fieldwork.
  • Permanent Record : Documents provide a permanent record that can be reviewed multiple times. This allows for repeated analysis and verification of the data.
  • Versatility : A wide variety of documents can be analyzed, from historical texts to contemporary digital content, providing flexibility and applicability to a broad range of research questions and fields.
  • Ability to Cross-Verify (Triangulate) Data : Documentary analysis can be used alongside other methods as a means of triangulating data, thus adding validity and reliability to the research.

Limitations of Documentary Analysis

While documentary analysis offers several benefits as a research method, it also has its limitations. It’s important to keep these in mind when deciding to use documentary analysis and when interpreting your findings:

  • Authenticity : Not all documents are genuine, and sometimes it can be challenging to verify the authenticity of a document, particularly for historical research.
  • Bias and Subjectivity : All documents are products of their time and their authors. They may reflect personal, cultural, political, or institutional biases, and these biases can affect the information they contain and how it is presented.
  • Incomplete or Missing Information : Documents may not provide all the information you need for your research. There may be gaps in the record, or crucial information may have been omitted, intentionally or unintentionally.
  • Access and Availability : Not all documents are readily available for analysis. Some may be restricted due to privacy, confidentiality, or security considerations. Others may be difficult to locate or access, particularly historical documents that haven’t been digitized.
  • Interpretation : Interpreting documents, particularly historical ones, can be challenging. You need to understand the context in which the document was created, including the social, cultural, political, and personal factors that might have influenced its content.
  • Time-Consuming : While documentary analysis can be cost-effective, it can also be time-consuming, especially if you have a large number of documents to analyze or if the documents are lengthy or complex.
  • Lack of Control Over Data : Unlike methods where the researcher collects the data themselves (e.g., through experiments or surveys), with documentary analysis, you have no control over what data is available. You are reliant on what others have chosen to record and preserve.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari . Revised on June 22, 2023.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research , which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

Example qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?

Table of contents

  • Approaches to qualitative research
  • Qualitative research methods
  • Qualitative data analysis
  • Advantages of qualitative research
  • Disadvantages of qualitative research
  • Other interesting articles
  • Frequently asked questions about qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography , action research , phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect , observer bias , recall bias , and social desirability bias . While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Each of these research approaches involves using one or more data collection methods . These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys : distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, to research the culture of an organization, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
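
The five steps above can be sketched end to end in a few lines, using hypothetical survey responses and a keyword-based codebook as a crude stand-in for the researcher's judgment:

```python
from collections import Counter

# Hypothetical survey responses (step 1: prepared and organized as a list).
responses = [
    "I left because the commute was too long.",
    "Pay was fine but the commute wore me down.",
    "I wanted more flexible hours.",
]

# Step 3: a simple keyword codebook drafted after reviewing the data (step 2).
codebook = {"commute": "work_location", "flexible": "flexibility", "pay": "compensation"}

# Step 4: assign codes to each response.
coded = [
    {code for kw, code in codebook.items() if kw in resp.lower()}
    for resp in responses
]

# Step 5: tally codes across responses to surface recurring themes.
theme_counts = Counter(code for codes in coded for code in codes)
print(theme_counts.most_common())
```

Real analysis is iterative rather than linear: new codes emerge mid-coding, and themes are refined as the researcher moves back and forth through the data.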

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated . The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population .

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Chi square goodness of fit test
  • Degrees of freedom
  • Null hypothesis
  • Discourse analysis
  • Control groups
  • Mixed methods research
  • Non-probability sampling
  • Quantitative research
  • Inclusion and exclusion criteria

Research bias

  • Rosenthal effect
  • Implicit bias
  • Cognitive bias
  • Selection bias
  • Negativity bias
  • Status quo bias

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .

Cite this Scribbr article


Bhandari, P. (2023, June 22). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved February 19, 2024, from https://www.scribbr.com/methodology/qualitative-research/


Rapid reviews methods series: guidance on rapid qualitative evidence synthesis

  • http://orcid.org/0000-0003-4808-3880 Andrew Booth 1 , 2 ,
  • Isolde Sommer 3 , 4 ,
  • Jane Noyes 2 , 5 ,
  • Catherine Houghton 2 , 6 ,
  • Fiona Campbell 1 , 7
  • The Cochrane Rapid Reviews Methods Group and Cochrane Qualitative and Implementation Methods Group (CQIMG)
  • 1 EnSyGN Sheffield Evidence Synthesis Group , University of Sheffield , Sheffield , UK
  • 2 Cochrane Qualitative and Implementation Methods Group (CQIMG) , London , UK
  • 3 Department for Evidence-based Medicine and Evaluation , University for Continuing Education Krems , Krems , Austria
  • 4 Cochrane Rapid Reviews Group & Cochrane Austria , Krems , Austria
  • 5 Bangor University , Bangor , UK
  • 6 University of Galway , Galway , Ireland
  • 7 University of Newcastle upon Tyne , Newcastle upon Tyne , UK
  • Correspondence to Professor Andrew Booth, University of Sheffield, Sheffield, UK; a.booth@sheffield.ac.uk

This paper forms part of a series of methodological guidance from the Cochrane Rapid Reviews Methods Group and addresses rapid qualitative evidence syntheses (QESs), which use modified systematic, transparent and reproducible methods to accelerate the synthesis of qualitative evidence when faced with resource constraints. This guidance covers the review process as it relates to synthesis of qualitative research. ‘Rapid’ or ‘resource-constrained’ QES require use of templates and targeted knowledge user involvement. Clear definition of perspectives and decisions on indirect evidence, sampling and use of existing QES help in targeting eligibility criteria. Involvement of an information specialist, especially in prioritising databases, targeting grey literature and planning supplemental searches, can prove invaluable. Use of templates and frameworks in study selection and data extraction can be accompanied by quality assurance procedures targeting areas of likely weakness. Current Cochrane guidance informs selection of tools for quality assessment and of synthesis method. Thematic and framework synthesis facilitate efficient synthesis of large numbers of studies or plentiful data. Finally, judicious use of Grading of Recommendations Assessment, Development and Evaluation approach for assessing the Confidence in the Evidence from Reviews of Qualitative research (GRADE-CERQual) assessments, and of software as appropriate, helps to achieve a timely and useful review product.

  • Systematic Reviews as Topic
  • Patient Care

Data availability statement

No data are available. Not applicable. All data is from published articles.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See:  http://creativecommons.org/licenses/by-nc/4.0/ .

https://doi.org/10.1136/bmjebm-2023-112620


WHAT IS ALREADY KNOWN ON THIS TOPIC

Rapid qualitative evidence synthesis (QES) is a relatively recent innovation in evidence synthesis, and few published examples currently exist.

Guidance for authoring a rapid QES is scattered and requires compilation and summary.

WHAT THIS STUDY ADDS

This paper represents the first attempt to compile current guidance, illustrated by the experience of several international review teams.

We identify features of rapid QES methods that could be accelerated or abbreviated and where methods resemble those for conventional QESs.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

This paper offers guidance for researchers when conducting a rapid QES and informs commissioners of research and policy-makers what to expect when commissioning such a review.

Introduction

This paper forms part of a series from the Cochrane Rapid Reviews Methods Group providing methodological guidance for rapid reviews. While other papers in the series 1–4 focus on generic considerations, we aim to provide in-depth recommendations specific to a resource-constrained (or rapid) qualitative evidence synthesis (rQES). 5 This paper is accompanied by recommended resources ( online supplemental appendix A ) and an elaboration with practical considerations ( online supplemental appendix B ).

Supplemental material

The role of qualitative evidence in decision-making is increasingly recognised. 6 This, in turn, has led to appreciation of the value of qualitative evidence syntheses (QESs) that summarise findings across multiple contexts. 7 Recognition of the need for such syntheses to be available at the time most useful to decision-making has, in turn, driven demand for rapid qualitative evidence syntheses. 8 The breadth of potential rQES mirrors the versatility of QES in general (from focused questions to broad overviews) and outputs range from descriptive thematic maps through to theory-informed syntheses (see table 1 ).

Table 1. Glossary of important terms (alphabetically)

As with other resource-constrained reviews, no one size fits all. A team should start by specifying the phenomenon of interest, the review question, 9 the perspectives to be included 9 and the sample to be determined and selected. 10 Subsequently, the team must finalise the appropriate choice of synthesis. 11 Above all, the review team should consider the intended knowledge users, 3 including requirements of the funder.

An rQES team, in particular, cannot afford any extra time or resource requirements that might arise from either a misunderstanding of the review question, an unclear picture of user requirements or an inappropriate choice of methods. The team seeks to align the review question and the requirements of the knowledge user with available time and resources. They also need to ensure that the choice of data and choice of synthesis are appropriate to the intended ‘knowledge claims’ (epistemology) made by the rQES. 11 This involves the team asking ‘what types of data are meaningful for this review question?’, ‘what types of data are trustworthy?’ and ‘is the favoured synthesis method appropriate for this type of data?’. 12 This paper aims to help rQES teams to choose methods that best fit their project while understanding the limitations of those choices. Our recommendations derive from current QES guidance, 5 evidence on modified QES methods, 8 13 and practical experience. 14 15

This paper presents an overview of considerations and recommendations as described in table 2 . Supplemental materials including additional resources details of our recommendations and practical examples are provided in online supplemental appendices A and B .

Table 2. Recommendations for resource-constrained qualitative evidence synthesis (rQES)

Setting the review question and topic refinement

Rapid reviews summarise information from multiple research studies to produce evidence for ‘the public, researchers, policymakers and funders in a systematic, resource-efficient manner’. 16 Involvement of knowledge users is critical. 3 Given time constraints, individual knowledge users could be asked only to feedback on very specific decisions and tasks or on selective sections of the protocol. Specifically, whenever a QES is abbreviated or accelerated, a team should ensure that the review question is agreed by a minimum number of knowledge users with expertise or experience that reflects all the important review perspectives and with authority to approve the final version 2 5 11 ( table 2 , item R1).

Involvement of topic experts can ensure that the rQES is responsive to need. 14 17 One Cochrane rQES saved considerable time by agreeing the review topic within a single meeting and one-phase iteration. 9 Decisions on topics to be omitted are also informed by a knowledge of existing QESs. 17

An information specialist can help to manage the quantity and quality of available evidence by setting conceptual boundaries and logistic limits. A structured question format, such as Setting-Perspective-Interest, phenomenon of-Comparison-Evaluation or Population-Interest, phenomenon of-Context helps in communicating the scope and, subsequently, in operationalising study selection. 9 18

Scoping (of review parameters) and mapping (of key types of evidence and likely richness of data) help when planning the review. 5 19 The option to choose purposive sampling over comprehensive sampling approaches, as offered by standard QES, may be particularly helpful in the context of a rapid QES. 8 Once a team knows the approximate number and distribution of studies (perhaps mapping them against country, age, ethnicity, etc), they can decide whether or not to use purposive sampling. 12 An rQES for the WHO combined purposive with variation sampling. Sampling in two stages started by reducing the initial number of studies to a more manageable sampling frame and then sampling approximately a third of the remaining studies from within the sampling frame. 20

Sampling may target richer studies and/or privilege diversity. 8 21 A rich qualitative study typically illustrates findings with verbatim extracts from transcripts from interviews or textual responses from questionnaires. Rich studies are often found in specialist qualitative research or social science journals. In contrast, less rich studies may itemise themes with an occasional indicative text extract and tend to summarise findings. In clinical or biomedical journals less rich findings may be placed within a single table or box.

No rule exists on an optimal number of studies; too many studies makes it challenging to ‘maintain insight’, 22 too few does not sustain rigorous analysis. 23 Guidance on sampling is available from the forthcoming Cochrane-Campbell QES Handbook.

A review team can use templates to fast-track writing of a protocol. The protocol should always be publicly available ( table 2 , item R2). 24 25 Formal registration may require that the team has not commenced data extraction but should be considered if it does not compromise the rQES timeframe. Time pressures may require that methods are left suitably flexible to allow well-justified changes to be made as a detailed picture of the studies and data emerge. 26 The first Cochrane rQES drew heavily on text from a joint protocol/review template previously produced within Cochrane. 24

Setting eligibility criteria

An rQES team may need to limit the number of perspectives, focusing on those most important for decision-making 5 9 27 ( table 2 , item R3). Beyond the patients/clients each additional perspective (eg, family members, health professionals, other professionals, etc) multiplies the additional effort involved.

A rapid QES may require strict date and setting restrictions 17 and language restrictions that accommodate the specific requirements of the review. Specifically, the team should consider whether changes in context over time or substantive differences between geographical regions could be used to justify a narrower date range or a limited coverage of countries and/or languages. The team should also decide if ‘indirect evidence’ is to substitute for the absence of direct evidence. An rQES typically focuses on direct evidence, except when only indirect evidence is available 28 ( table 2 , item R4). Decisions on relevance are challenging—precautions for swine influenza may inform precautions for bird influenza. 28 A smoking ban may operate similarly to seat belt legislation, etc. A review team should identify where such shared mechanisms might operate. 28 An rQES team must also decide whether to use frameworks or models to focus the review. Theories may be unearthed within the topic search or be already known to team members, for example, the Theory of Planned Behaviour. 29

Options for managing the quantity and quality of studies and data emerge during the scoping (see above). In summary, the review team should consider privileging rich qualitative studies, 2 adopting a stepwise approach to inclusion of qualitative data, and exploring the possibility of sampling ( table 2 , item R5). For example, where data are plentiful, an rQES may be limited to qualitative research and/or to mixed methods studies. Where data are less plentiful, surveys or other qualitative data sources may need to be included. Where plentiful reviews already exist, a team may decide to conduct a review of reviews 5 by including multiple QES within a mega-synthesis 28 29 ( table 2 , item R6).

Searching for QES merits its own guidance; 21–23 30 this section reinforces important considerations from guidance specific to qualitative research. Generic guidance for rapid reviews in this series broadly applies to rapid QES. 1

In addition to journal articles, by far the most plentiful source, qualitative research is found in book chapters, theses and in published and unpublished reports. 21 Searches to support an rQES can (a) limit the number of databases searched, deliberately selecting databases from diverse disciplines, (b) use abbreviated study filters to retrieve qualitative designs and (c) employ high yield complementary methods (eg, reference checking, citation searching and Related Articles features). An information specialist (eg, librarian) should be involved in prioritising sources and search methods ( table 2 , item R7). 11 14

According to empirical evidence, optimal database combinations include Scopus plus CINAHL or Scopus plus ProQuest Dissertations and Theses Global (two-database combinations) and Scopus plus CINAHL plus ProQuest Dissertations and Theses Global (three-database combination), with these combinations retrieving between 89% and 92% of relevant studies. 30

If resources allow, searches should include one or two specialised databases ( table 2 , item R8) from different disciplines or contexts 21 (eg, social science databases, specialist discipline databases or regional or institutional repositories). Even when resources are limited, the information specialist should factor in time for peer review of at least one search strategy ( table 2 , item R9). 31 Searches for ‘grey literature’ should selectively target appropriate types of grey literature (such as theses or process evaluations) and supplemental searches, including citation chaining or Related Articles features ( table 2 , item R10). 32 The first Cochrane rQES reported that searching reference lists of key papers yielded an extra 30 candidate papers for review. However, the team documented exclusion of grey literature as a limitation of their review. 15

Study selection

Consistency in study selection is achieved by using templates, by gaining a shared team understanding of the audience and purpose, and by ongoing communication within, and beyond, the team. 2 33 Individuals may work in parallel on the same task, as in the first Cochrane rQES, or follow a ‘segmented’ approach where each reviewer is allocated a different task. 14 The use of machine learning in the specific context of rQES remains experimental. However, the possibility of developing qualitative study classifiers comparable to those for randomised controlled trials offers an achievable aspiration. 34

Title and abstract screening

The entire screening team should use pre-prepared, pretested title and abstract templates to limit the scale of piloting, calibration and testing ( table 2 , item R11). 1 14 The first Cochrane rQES team double-screened titles and abstracts within Covidence review software. 14 Disagreements were resolved with reference to a third reviewer, achieving a shared understanding of the eligibility criteria and enhancing familiarity with the target studies and insight from the data. 14 The team should target and prioritise identified risks of either over-zealous inclusion or over-exclusion specific to each rQES ( table 2 , item R12). 14 The team should maximise opportunities to capture divergent views and perspectives within study findings. 35

Full-text screening

Full-text screening similarly benefits from using a pre-prepared pretested standardised template where possible 1 14 ( table 2 , item R11). If a single reviewer undertakes full-text screening, 8 the team should identify likely risks to trustworthiness of findings and focus quality control procedures (eg, use of additional reviewers and percentages for double screening) on specific threats 14 ( table 2 , item R13). The Cochrane rQES team opted for double screening to assist their immersion within the topic. 14

Data extraction

Data extraction of descriptive/contextual data may be facilitated by review management software (eg, EPPI-Reviewer) or home-made approaches using Google Forms or other survey software. 36 Where extraction of qualitative findings requires line-by-line coding with multiple iterations of the data, a qualitative data management analysis package, such as QSR NVivo, reaps dividends. 36 The team must decide if, collectively, they favour extracting data to a template or coding directly within an electronic version of an article.

Quality control must be fit for purpose but not excessive. Published examples typically use a single reviewer for data extraction, 8 with use of two independent reviewers being the exception. The team could limit data extraction to minimal essential items. They may also consider re-using descriptive details and findings already extracted within previous well-conducted QES ( table 2 , item R14). A pre-existing framework, where readily identified, may help to structure the data extraction template. 15 37 The same framework may be used to present the findings. Some organisations may specify a preferred framework, such as an evidence-to-decision-making framework. 38

Assessment of methodological limitations

The QES community assess ‘methodological limitations’ rather than use ‘risk of bias’ terminology. An rQES team should pick an approach appropriate to their specific review. For example, a thematic map may not require assessment of individual studies—a brief statement of the generic limitations of the set of studies may be sufficient. However, for any synthesis that underpins practice recommendations 39 assessment of included studies is integral to the credibility of findings. In any decision-making context that involves recommendations or guidelines, an assessment of methodological limitations is mandatory. 40 41

Each review team should work with knowledge users to determine a review-specific approach to quality assessment. 27 While ‘traffic lights’, similar to the outputs from the Cochrane Risk of Bias tool, may facilitate rapid interpretation, accompanying textual notes are invaluable in highlighting specific areas for concern. In particular, the rQES team should demonstrate that they are aware (a) that research designs for qualitative research seek to elicit divergent views, rather than control for variation; (b) that, for qualitative research, the selection of the sample is far more informative than the size of the sample; and (c) that researchers from primary research, and equally reviewers for the qualitative synthesis, need to be thoughtful and reflexive about their possible influences on interpretation of either the primary data or the synthesised findings.

Selection of checklist

Numerous scales and checklists exist for assessing the quality of qualitative studies. In the absence of validated risk of bias tools for qualitative studies, the team should choose a tool according to Cochrane Qualitative and Implementation Methods Group (CQIMG) guidance together with expediency (ease of use, prior familiarity, etc) ( table 2 , item R15). 41 In contrast to the Critical Appraisal Skills Programme checklist, which was never designed for use in synthesis, 42 the Cochrane qualitative tool is comparably easy to use and was designed for QES use. Work is underway to identify an assessment process that is compatible with QESs that support decision-making. 41 For now, the choice of a checklist remains determined by interim Cochrane guidance and, beyond this, by personal preference and experience. For an rQES, a team could use a single reviewer to assess methodological limitations, with verification of judgements (and support statements) by a second reviewer ( table 2 , item R16).

The CQIMG endorses three types of synthesis: thematic synthesis, framework synthesis and meta-ethnography ( box 1 ). 43 44 Rapid QES favour descriptive thematic synthesis 45 or framework synthesis, 46 47 except when theory generation (meta-ethnography 48 49 or analytical thematic synthesis) is a priority ( table 2 , item R17).

Choosing a method for rapid qualitative synthesis

Thematic synthesis: first choice method for rQES. 45 For example, in their rapid QES Crooks and colleagues 44 used a thematic synthesis to understand the experiences of both academic and lived experience coresearchers within palliative and end of life research. 45

Framework synthesis: alternative where a suitable framework can be speedily identified. 46 For example, Bright and colleagues 46 considered ‘best-fit framework synthesis’ as appropriate for mapping study findings to an ‘a priori framework of dimensions measured by prenatal maternal anxiety tools’ within their ‘streamlined and time-limited evidence review’. 47

Less commonly, an adapted meta-ethnographical approach was used for an implementation model of social distancing where supportive data (29 studies) was plentiful. 48 However, this QES demonstrates several features that subsequently challenge its original identification as ‘rapid’. 49

Abbreviations: QES, qualitative evidence synthesis; rQES, rapid qualitative evidence synthesis.

The team should consider whether a conceptual model, theory or framework offers a rapid way of organising, coding, interpreting and presenting findings ( table 2 , item R18). If the extracted data appear rich enough to sustain further interpretation, data from a thematic or framework synthesis can subsequently be explored within a meta-ethnography. 43 However, this requires a team with substantial interpretative expertise. 11

Assessments of confidence in the evidence 4 are central to any rQES that seeks to support decision-making; the QES-specific Grading of Recommendations Assessment, Development and Evaluation approach for assessing the Confidence of Evidence from Reviews of Qualitative research (GRADE-CERQual) is designed to assess confidence in qualitative evidence. 50 This can be performed by a single reviewer and confirmed by a second reviewer. 26 Additional reviewers could verify all, or a sample of, assessments. For a rapid assessment, a team must prioritise findings using objective criteria; a WHO rQES focused only on the three ‘highly synthesised findings’. 20 The team could consider reusing GRADE-CERQual assessments from published QESs if findings are relevant and of demonstrable high quality ( table 2 , item R19). 50 No rapid approach to full application of GRADE-CERQual currently exists.

Reporting and record management

Little is written on optimal use of technology. 8 A rapid review is not a good time to learn review management software or qualitative analysis management software. A sounder strategy is to use such software for all general QES processes ( table 2 , item R20) and then harness these skills and tools when under specific resource pressures. Good file labelling, careful folder management and a ‘develop once, re-use multiple times’ approach all facilitate resource savings.

Reporting requirements include the meta-ethnography reporting guidance (eMERGe) 51 and the Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) statement. 52 An rQES should describe limitations, and their implications for confidence in the evidence, even more thoroughly than a regular QES, detailing the consequences of fast-tracking, streamlining or omitting processes altogether. 8 Time spent documenting reflexivity is similarly important. 27 If QES methodology is to remain credible, rapid approaches must be applied with insight and documented with circumspection. 53 54

Ethics statements

Patient consent for publication.

Not applicable.

Ethics approval


Supplementary materials

Supplementary data.

This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

  • Data supplement 1

Contributors All authors (AB, IS, JN, CH, FC) have made substantial contributions to the conception and design of the guidance document. AB led on drafting the work and revising it critically for important intellectual content. All other authors (IS, JN, CH, FC) contributed to revisions of the document. All authors (AB, IS, JN, CH, FC) have given final approval of the version to be published. As members of the Cochrane Qualitative and Implementation Methods Group and/or the Cochrane Rapid Reviews Methods Group all authors (AB, IS, JN, CH, FC) agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests AB is co-convenor of the Cochrane Qualitative and Implementation Methods Group. In the last 36 months, he received royalties from Systematic Approaches To a Successful Literature Review (Sage 3rd edition), honoraria from the Agency for Healthcare Research and Quality, and travel support from the WHO. JN is lead convenor of the Cochrane Qualitative and Implementation Methods Group. In the last 36 months, she has received honoraria from the Agency for Healthcare Research and Quality and travel support from the WHO. CH is co-convenor of the Cochrane Qualitative and Implementation Methods Group.

Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Provenance and peer review Not commissioned; internally peer reviewed.

Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.


Literature Reviews


Additional Tools

Google Slides

GSlides can create concept maps using their Diagram feature. Insert > Diagram > Hierarchy will give you some editable templates to use.

Tutorial on diagrams in GSlides .

Microsoft Word

MS Word can create concept maps using Insert > SmartArt Graphic. Select Process, Cycle, Hierarchy, or Relationship to see templates.

NVivo  is software for qualitative analysis that has a concept map feature. Zotero libraries can be uploaded using .ris files. NVivo Concept Map information.

A concept map or mind map is a visual representation of knowledge that illustrates relationships between concepts or ideas. It is a tool for organizing and representing information in a hierarchical and interconnected manner. At its core, a concept map consists of nodes, which represent individual concepts or ideas, and links, which depict the relationships between these concepts .
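The node-and-link structure described above can be sketched as a small data structure. This is only an illustration: the concepts and relation labels below are hypothetical examples, not drawn from any particular tool.

```python
# Minimal sketch of a concept map: nodes are concepts, links are labelled
# (source, relation, target) triples. All names here are hypothetical.
concept_map = {
    "nodes": ["qualitative research", "data collection", "document review"],
    "links": [
        ("qualitative research", "includes", "data collection"),
        ("data collection", "uses", "document review"),
    ],
}

def neighbours(cmap, node):
    """Return the concepts directly linked from a given node."""
    return [target for source, _, target in cmap["links"] if source == node]
```

Calling `neighbours(concept_map, "data collection")` returns `["document review"]`, reflecting the hierarchical, interconnected organisation the definition describes.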

Below is a non-exhaustive list of tools that can facilitate the creation of concept maps.


www.canva.com

Canva is a user-friendly graphic design platform that enables individuals to create visual content quickly and easily. It offers a diverse array of customizable templates, design elements, and tools, making it accessible to users with varying levels of design experience. 

Pros: comes with many pre-made concept map templates to get you started

Cons : not all features are available in the free version

Explore Canva concept map templates here .

Note: Although Canva advertises an "education" option, this is for K-12 only and does not apply to university users.


www.lucidchart.com

Lucid has two tools that can create mind maps (as they are called inside Lucid): Lucidchart is the place to build, document, and diagram, and Lucidspark is the place to ideate, connect, and plan.

Lucidchart is a collaborative online diagramming and visualization tool that allows users to create a wide range of diagrams, including flowcharts, org charts, wireframes, and mind maps. Its mind-mapping feature provides a structured framework for brainstorming ideas, organizing thoughts, and visualizing relationships between concepts. 

Lucidspark  works as a virtual whiteboard. Here, you can add sticky notes, develop ideas through freehand drawing, and collaborate with your teammates. It has only one template for mind mapping.

Explore Lucid mind map creation here .


Note: U-M students have access to Lucid through ITS. [ info here ] Choose the "Login w Google" option, use your @umich.edu account, and access should happen automatically.


www.figma.com

Figma is a cloud-based design tool that enables collaborative interface design and prototyping. It's widely used by UI/UX designers to create, prototype, and iterate on digital designs. Figma is the main design tool, and FigJam is their virtual whiteboard:

Figma  is a comprehensive design tool that enables designers to create and prototype high-fidelity designs

FigJam focuses on collaboration and brainstorming, providing a virtual whiteboard-like experience, best for concept maps

Explore FigJam concept maps here .


Note: There is a " Figma for Education " version for students that will provide access. Choose the "Login w Google" option, use your @umich.edu account, and access should happen automatically.


www.mindmeister.com

MindMeister  is an online mind mapping tool that allows users to visually organize their thoughts, ideas, and information in a structured and hierarchical format. It provides a digital canvas where users can create and manipulate nodes representing concepts or topics, and connect them with lines to show relationships and associations.

Features: collaborative (permits multiple co-authors) with multiple export formats. The free version allows up to 3 mind maps.

Explore  MindMeister templates here .

  • Last Updated: Feb 15, 2024 1:47 PM
  • URL: https://guides.lib.umich.edu/litreview


Characteristics of Qualitative Descriptive Studies: A Systematic Review

MSN, CRNP, Doctoral Candidate, University of Pennsylvania School of Nursing

Justine S. Sefcik, MS, RN, Doctoral Candidate, University of Pennsylvania School of Nursing

Christine Bradway, PhD, CRNP, FAAN, Associate Professor of Gerontological Nursing, University of Pennsylvania School of Nursing

Qualitative description (QD) is a term that is widely used to describe qualitative studies of health care and nursing-related phenomena. However, limited discussions regarding QD are found in the existing literature. In this systematic review, we identified characteristics of methods and findings reported in research articles published in 2014 whose authors identified the work as QD. After searching and screening, data were extracted from the sample of 55 QD articles and examined to characterize research objectives, design justification, theoretical/philosophical frameworks, sampling and sample size, data collection and sources, data analysis, and presentation of findings. In this review, three primary findings were identified. First, despite inconsistencies, most articles included characteristics consistent with limited, available QD definitions and descriptions. Next, flexibility or variability of methods was common and desirable for obtaining rich data and achieving understanding of a phenomenon. Finally, justification for how a QD approach was chosen and why it would be an appropriate fit for a particular study was limited in the sample and, therefore, in need of increased attention. Based on these findings, recommendations include encouragement to researchers to provide as many details as possible regarding the methods of their QD study so that readers can determine whether the methods used were reasonable and effective in producing useful findings.

Qualitative description (QD) is a label used in qualitative research for studies which are descriptive in nature, particularly for examining health care and nursing-related phenomena ( Polit & Beck, 2009 , 2014 ). QD is a widely cited research tradition and has been identified as important and appropriate for research questions focused on discovering the who, what, and where of events or experiences and gaining insights from informants regarding a poorly understood phenomenon. It is also the label of choice when a straight description of a phenomenon is desired or information is sought to develop and refine questionnaires or interventions ( Neergaard et al., 2009 ; Sullivan-Bolyai et al., 2005 ).

Despite many strengths and frequent citations of its use, limited discussions regarding QD are found in qualitative research textbooks and publications. To the best of our knowledge, only seven articles include specific guidance on how to design, implement, analyze, or report the results of a QD study ( Milne & Oberle, 2005 ; Neergaard, Olesen, Andersen, & Sondergaard, 2009 ; Sandelowski, 2000 , 2010 ; Sullivan-Bolyai, Bova, & Harper, 2005 ; Vaismoradi, Turunen, & Bondas, 2013 ; Willis, Sullivan-Bolyai, Knafl, & Zichi-Cohen, 2016 ). Furthermore, little is known about characteristics of QD as reported in journal-published, nursing-related, qualitative studies. Therefore, the purpose of this systematic review was to describe specific characteristics of methods and findings of studies reported in journal articles (published in 2014) self-labeled as QD. In this review, we did not have a goal to judge whether QD was done correctly but rather to report on the features of the methods and findings.

Features of QD

Several QD design features and techniques have been described in the literature. First, researchers generally draw from a naturalistic perspective and examine a phenomenon in its natural state ( Sandelowski, 2000 ). Second, QD has been described as less theoretical compared to other qualitative approaches ( Neergaard et al., 2009 ), facilitating flexibility in commitment to a theory or framework when designing and conducting a study ( Sandelowski, 2000 , 2010 ). For example, researchers may or may not decide to begin with a theory of the targeted phenomenon and do not need to stay committed to a theory or framework if their investigations take them down another path ( Sandelowski, 2010 ). Third, data collection strategies typically involve individual and/or focus group interviews with minimal to semi-structured interview guides ( Neergaard et al., 2009 ; Sandelowski, 2000 ). Fourth, researchers commonly employ purposeful sampling techniques such as maximum variation sampling which has been described as being useful for obtaining broad insights and rich information ( Neergaard et al., 2009 ; Sandelowski, 2000 ). Fifth, content analysis (and in many cases, supplemented by descriptive quantitative data to describe the study sample) is considered a primary strategy for data analysis ( Neergaard et al., 2009 ; Sandelowski, 2000 ). In some instances thematic analysis may also be used to analyze data; however, experts suggest care should be taken that this type of analysis is not confused with content analysis ( Vaismoradi et al., 2013 ). These data analysis approaches allow researchers to stay close to the data and as such, interpretation is of low-inference ( Neergaard et al., 2009 ), meaning that different researchers will agree more readily on the same findings even if they do not choose to present the findings in the same way ( Sandelowski, 2000 ). 
Finally, representation of study findings in published reports is expected to be straightforward, including comprehensive descriptive summaries and accurate details of the data collected, and presented in a way that makes sense to the reader ( Neergaard et al., 2009 ; Sandelowski, 2000 ).

It is also important to acknowledge that variations in methods or techniques may be appropriate across QD studies ( Sandelowski, 2010 ). For example, when consistent with the study goals, decisions may be made to use techniques from other qualitative traditions, such as employing a constant comparative analytic approach typically associated with grounded theory ( Sandelowski, 2000 ).

Search Strategy and Study Screening

The PubMed electronic database was searched for articles written in English and published from January 1, 2014 to December 31, 2014, using the terms, “qualitative descriptive study,” “qualitative descriptive design,” and “qualitative description,” combined with “nursing.” This specific publication year, “2014,” was chosen because it was the most recent full year at the time of beginning this systematic review. As we did not intend to identify trends in QD approaches over time, it seemed reasonable to focus on the nursing QD studies published in a certain year. The inclusion criterion for this review was data-based, nursing-related, research articles in which authors used the terms QD, qualitative descriptive study, or qualitative descriptive design in their titles or abstracts as well as in the main texts of the publication.
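As a rough illustration of the strategy above, the three phrase searches could each be combined with “nursing” and the 2014 date limit as PubMed query strings. The authors do not report their exact query syntax; the field tag `[PDAT]` (publication date) and the phrasing below are assumptions based on standard PubMed search syntax.

```python
# Hypothetical sketch of the PubMed queries described above; the exact
# syntax the review authors used is not reported in the text.
PHRASES = [
    '"qualitative descriptive study"',
    '"qualitative descriptive design"',
    '"qualitative description"',
]

def build_query(phrase: str) -> str:
    """Combine one QD phrase with 'nursing' and restrict to 2014 publications."""
    date_limit = '"2014/01/01"[PDAT] : "2014/12/31"[PDAT]'
    return f"({phrase} AND nursing) AND ({date_limit})"

queries = [build_query(p) for p in PHRASES]
```

Each resulting string could be pasted into the PubMed search box or passed to a programmatic search interface.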

All articles yielded through an initial search in PubMed were exported into EndNote X7 ( Thomson Reuters, 2014 ), a reference management software, and duplicates were removed. Next, titles and abstracts were reviewed to determine if the publication met inclusion criteria; all articles meeting inclusion criteria were then read independently in full by two authors (HK and JS) to determine if the terms – QD or qualitative descriptive study/design – were clearly stated in the main texts. Any articles in which researchers did not specifically state these key terms in the main text were then excluded, even if the terms had been used in the study title or abstract. In one article, for example, although “qualitative descriptive study” was reported in the published abstract, the researchers reported a “qualitative exploratory design” in the main text of the article ( Sundqvist & Carlsson, 2014 ); therefore, this article was excluded from our review. Despite the possibility that there may be other QD studies published in 2014 that were not labeled as such, to facilitate our screening process we only included articles where the researchers clearly used our search terms for their approach. Finally, the two authors compared, discussed, and reconciled their lists of articles with a third author (CB).

Study Selection

Although the year 2014 was specifically requested, 95 articles were initially identified (due to ahead-of-print/Epub records) and exported into the EndNote program. Three duplicate publications were removed, and the 20 articles with final publication dates of 2015 were also excluded. The remaining 72 articles were then screened by examining titles, abstracts, and full texts. Based on our inclusion criteria, 15 (of 72) were then excluded because QD or a QD design/study was not identified in the main text. We then re-examined the remaining 57 articles and excluded two additional articles that did not meet inclusion criteria (e.g., QD was only reported as an analytic approach in the data analysis section). The remaining 55 publications met inclusion criteria and comprised the sample for our systematic review (see Figure 1 ).
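The exclusion arithmetic reported above can be tallied step by step; the counts are taken directly from the text.

```python
# Screening tally for the study-selection counts reported above.
records = 95          # initial PubMed yield (including ahead-of-print records)
records -= 3          # duplicate publications removed
records -= 20         # final publication dates of 2015, excluded
assert records == 72  # screened on titles, abstracts, and full texts
records -= 15         # QD not identified in the main text
assert records == 57
records -= 2          # e.g., QD reported only as an analytic approach
assert records == 55  # final sample for the systematic review
```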

Figure 1. Flow Diagram of Study Selection

Of the 55 publications, 23 originated from North America (17 in the United States; 6 in Canada), 12 from Asia, 11 from Europe, 7 from Australia and New Zealand, and 2 from South America. Eleven studies were part of larger research projects and two of them were reported as part of larger mixed-methods studies. Four were described as a secondary analysis.

Quality Appraisal Process

Following the identification of the 55 publications, two authors (HK and JS) independently examined each article using the Critical Appraisal Skills Programme (CASP) qualitative checklist ( CASP, 2013 ). The CASP was chosen to determine the general adequacy (or rigor) of the qualitative studies included in this review, as the CASP criteria are generic and intended to apply to qualitative studies in general. In addition, the CASP was useful because it allowed us to examine the internal consistency between study aims and methods, and between study aims and findings, as well as the usefulness of the findings ( CASP, 2013 ). The CASP consists of 10 main questions, each with several sub-questions to consider when deciding on the main question ( CASP, 2013 ). The first two questions have reviewers examine the clarity of study aims and the appropriateness of using qualitative research to achieve those aims. With the next eight questions, reviewers assess study design, sampling, data collection, and analysis, as well as the clarity of the study's statement of results and the value of the research. We used the seven questions and 17 sub-questions related to methods and the statement of findings to evaluate the articles. The results of this process are presented in Table 1 .

CASP Questions and Quality Appraisal Results (N = 55)

Note . The CASP questions are adapted from “10 questions to help you make sense of qualitative research,” by Critical Appraisal Skills Programme, 2013, retrieved from http://media.wix.com/ugd/dded87_29c5b002d99342f788c6ac670e49f274.pdf . Its license can be found at http://creativecommons.org/licenses/by-nc-sa/3.0/

Once the articles had been assessed independently by the two authors, all three authors discussed and reconciled the assessments. No articles were excluded based on CASP results; rather, the results were used to depict the general adequacy (or rigor) of all 55 articles meeting inclusion criteria for our systematic review. In addition, the CASP enhanced our examination of the relationship between the methods and the usefulness of the findings documented in each QD article included in this review.

Process for Data Extraction and Analysis

To further assess each of the 55 articles, data were extracted on: (a) research objectives, (b) design justification, (c) theoretical or philosophical framework, (d) sampling and sample size, (e) data collection and data sources, (f) data analysis, and (g) presentation of findings (see Table 2 ). We discussed extracted data and identified common and unique features in the articles included in our systematic review. Findings are described in detail below and in Table 3 .

Elements for Data Extraction

Data Extraction and Analysis Results

Note . NR = not reported

Quality Appraisal Results

Justification for use of a QD design was evident in close to half (47.3%) of the 55 publications. While most researchers clearly described recruitment strategies (80%) and data collection methods (100%), justification for how the study setting was selected was identified in only 38.2% of the articles, and almost 75% of the articles included no rationale for the choice of data collection methods (e.g., focus-group interviews). Researchers did not explain their involvement and positionality during recruitment and data collection in the vast majority (90.9%) of articles, or during data analysis in 63.6%. Ethical standards were reported in more than 89% of all articles, and most articles included an in-depth description of data analysis (83.6%) and of the development of categories or themes (92.7%). Finally, all researchers clearly stated their findings in relation to their research questions/objectives, and researchers in 83.3% of the articles discussed the credibility of their findings (see Table 1 ).
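As a rough cross-check (our own calculation, not part of the original appraisal), the reported percentages can be converted back into approximate article counts out of 55; note that CASP items scored against a smaller denominator (e.g., not-applicable articles) would differ slightly:

```python
# Convert reported percentages into approximate counts of the 55 articles.
# Labels are our own shorthand for the CASP-related items in the text.
N = 55
reported_pct = {
    "design justification": 47.3,
    "recruitment strategy described": 80.0,
    "data collection described": 100.0,
    "setting selection justified": 38.2,
    "analysis described in depth": 83.6,
    "theme development described": 92.7,
}
counts = {item: round(pct / 100 * N) for item, pct in reported_pct.items()}
print(counts["recruitment strategy described"])  # 44
```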

Research Objectives

In statements of study objectives and/or questions, the most frequently used verbs were “explore” ( n = 22) and “describe” ( n = 17). Researchers also used “identify” ( n = 3), “understand” ( n = 4), or “investigate” ( n = 2). Most articles focused on participants’ experiences related to certain phenomena ( n = 18), facilitators/challenges/factors/reasons ( n = 14), perceptions about specific care/nursing practice/interventions ( n = 11), and knowledge/attitudes/beliefs ( n = 3).

Design Justification

A total of 30 articles included references for QD. The most frequently cited references ( n = 23) were “Whatever happened to qualitative description?” ( Sandelowski, 2000 ) and “What’s in a name? Qualitative description revisited” ( Sandelowski, 2010 ). Other references cited included “Qualitative description – the poor cousin of health research?” ( Neergaard et al., 2009 ), “Reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research” ( Pope & Mays, 1995 ), and general research textbooks ( Polit & Beck, 2004 , 2012 ).

In 26 articles (and not necessarily the same as those citing specific references to QD), researchers provided a rationale for selecting QD. Most researchers chose QD because this approach aims to produce a straight description and comprehensive summary of the phenomenon of interest using participants’ language and staying close to the data (or using low inference).

Authors of two articles explicitly stated a QD design yet also acknowledged grounded-theory or phenomenological overtones, adopting some techniques from these qualitative traditions ( Michael, O'Callaghan, Baird, Hiscock, & Clayton, 2014 ; Peacock, Hammond-Collins, & Forbes, 2014 ). For example, Michael et al. (2014 , p. 1066) reported:

The research used a qualitative descriptive design with grounded theory overtones ( Sandelowski, 2000 ). We sought to provide a comprehensive summary of participants’ views through theoretical sampling; multiple data sources (focus groups [FGs] and interviews); inductive, cyclic, and constant comparative analysis; and condensation of data into thematic representations ( Corbin & Strauss, 1990 , 2008 ).

Authors of four additional articles included language suggestive of a grounded-theory or phenomenological tradition, e.g., by employing a constant comparison technique or translating themes stated in participants’ language into the primary language of the researchers during data analysis ( Asemani et al., 2014 ; Li, Lee, Chen, Jeng, & Chen, 2014 ; Ma, 2014 ; Soule, 2014 ). Additionally, Li et al. (2014) specifically reported use of a grounded-theory approach.

Theoretical or Philosophical Framework

In most (n = 48) articles, researchers did not specify any theoretical or philosophical framework. Of those articles in which a framework or philosophical stance was included, the authors of five articles described the framework as guiding the development of an interview guide ( Al-Zadjali, Keller, Larkey, & Evans, 2014 ; DeBruyn, Ochoa-Marin, & Semenic, 2014 ; Fantasia, Sutherland, Fontenot, & Ierardi, 2014 ; Ma, 2014 ; Wiens, Babenko-Mould, & Iwasiw, 2014 ). In two articles, data analysis was described as including key concepts of a framework being used as pre-determined codes or categories ( Al-Zadjali et al., 2014 ; Wiens et al., 2014 ). Oosterveld-Vlug et al. (2014) and Zhang, Shan, and Jiang (2014) discussed a conceptual model and underlying philosophy in detail in the background or discussion section, although the model and philosophy were not described as being used in developing interview questions or analyzing data.

Sampling and Sample Size

In 38 of the 55 articles, researchers reported ‘purposeful sampling’ or some derivation of purposeful sampling such as convenience ( n = 10), maximum variation ( n = 8), snowball ( n = 3), and theoretical sampling ( n = 1). In three instances ( Asemani et al., 2014 ; Chan & Lopez, 2014 ; Soule, 2014 ), multiple sampling strategies were described, for example, a combination of snowball, convenience, and maximum variation sampling. In articles where maximum variation sampling was employed, “variation” referred to seeking diversity in participants’ demographics ( n = 7; e.g., age, gender, and education level), while one article did not include details regarding how their maximum variation sampling strategy was operationalized ( Marcinowicz, Abramowicz, Zarzycka, Abramowicz, & Konstantynowicz, 2014 ). Authors of 17 articles did not specify their sampling techniques.

Sample sizes ranged from 8 to 1,932, with nine studies in the 8–10 participant range and 24 studies in the 11–20 participant range. Ranges of 21–30 and 31–50 participants were each reported in eight articles. Six studies included more than 50 participants. Two of these reported very large samples (N = 253, Hart & Mareno, 2014 ; N = 1,932, Lyndon et al., 2014 ), and the authors of both described using survey instruments and analyzing responses to open-ended questions. This contrasts with the studies with smaller sample sizes, in which individual interviews and focus groups were more commonly employed.
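The sample-size distribution above accounts for all 55 studies; a trivial tally (bin labels are our own, counts come from the text) confirms the arithmetic:

```python
# Tally the reported sample-size bins to verify they cover all 55 studies.
size_bins = {
    "8-10 participants": 9,
    "11-20 participants": 24,
    "21-30 participants": 8,
    "31-50 participants": 8,
    ">50 participants": 6,
}
total = sum(size_bins.values())
print(total)  # 55
```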

Data Collection and Data Sources

In a majority of studies, researchers collected data through semistructured individual ( n = 39) and/or focus-group ( n = 14) interviews. Most researchers reported that interviews were audiotaped ( n = 51), and interview guides were described as the primary data collection tool in 29 of these 51 studies. In some cases, researchers also described additional data sources, for example, memos or field notes taken during participant observation sessions or to record reflections on interviews ( n = 10). Written responses to open-ended questions in survey questionnaires were another data source in a small number of studies ( n = 4).

Data Analysis

The analysis strategy most commonly used in the QD studies included in this review was qualitative content analysis ( n = 30). Among the studies using this technique, most researchers described an inductive approach; researchers in two studies analyzed data both inductively and deductively. Thematic analysis was adopted in 14 studies and the constant comparison technique in 10. In nine studies, researchers employed multiple analytic techniques, including qualitative content analysis with constant comparison ( Asemani et al., 2014 ; DeBruyn et al., 2014 ; Holland, Christensen, Shone, Kearney, & Kitzman, 2014 ; Li et al., 2014 ) and thematic analysis with constant comparison ( Johansson, Hildingsson, & Fenwick, 2014 ; Oosterveld-Vlug et al., 2014 ). In addition, five teams conducted descriptive statistical analysis using both quantitative and qualitative data, counting the frequencies of codes/themes ( Ewens, Chapman, Tulloch, & Hendricks, 2014 ; Miller, 2014 ; Santos, Sandelowski, & Gualda, 2014 ; Villar, Celdran, Faba, & Serrat, 2014 ) or of targeted events captured through video monitoring ( Martorella, Boitor, Michaud, & Gelinas, 2014 ). Tseng, Chen, and Wang (2014) cited the interpretive description of Thorne, Reimer Kirkham, and O'Flynn-Magee (2004) as their inductive analytic approach. In five of the 55 articles, researchers did not specifically name their analysis strategies despite describing procedural aspects of data analysis. Researchers in 20 studies reported that data saturation for their themes was achieved.
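Counting code/theme frequencies, as several teams did, is a simple tallying step. The sketch below illustrates the idea only; the codes and participant IDs are invented for demonstration and do not come from any reviewed study:

```python
from collections import Counter

# Hypothetical coded interview segments: (participant_id, code) pairs,
# as a team might produce while coding transcripts.
coded_segments = [
    ("P01", "barrier:time"),
    ("P01", "facilitator:family support"),
    ("P02", "barrier:time"),
    ("P03", "barrier:cost"),
    ("P03", "barrier:time"),
]

# Tally how often each code was applied across all participants.
code_frequencies = Counter(code for _, code in coded_segments)
print(code_frequencies.most_common(1))  # [('barrier:time', 3)]
```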

Presentation of Findings

Researchers described participants’ experiences of health care, interventions, or illnesses in 18 articles and presented straightforward, focused, detailed descriptions of facilitators, challenges, factors, reasons, and causes in 15 articles. Participants’ perceptions of specific care, interventions, or programs were described in detail in 11 articles. All researchers presented their findings with extensive descriptions including themes or categories. In 25 of 55 articles, figures or tables were also presented to illustrate or summarize the findings. In addition, the authors of three articles summarized, organized, and described their data using key concepts of conceptual models ( Al-Zadjali et al., 2014 ; Oosterveld-Vlug et al., 2014 ; Wiens et al., 2014 ). Martorella et al. (2014) assessed acceptability and feasibility of hand massage therapy and arranged their findings in relation to pre-determined indicators of acceptability and feasibility. In one longitudinal QD study ( Kneck, Fagerberg, Eriksson, & Lundman, 2014 ), the researchers presented the findings as several key patterns of learning for persons living with diabetes; in another longitudinal QD study ( Stegenga & Macpherson, 2014 ), findings were presented as processes and themes regarding patients’ identity work across the cancer trajectory. In another two studies, the researchers described and compared themes or categories from two different perspectives, such as patients and nurses ( Canzan, Heilemann, Saiani, Mortari, & Ambrosi, 2014 ) or parents and children ( Marcinowicz et al., 2014 ). Additionally, Ma (2014) reported themes using both participants’ language and the researcher’s language.

In this systematic review, we examined and reported specific characteristics of methods and findings reported in journal articles self-identified as QD and published during one calendar year. To accomplish this we identified 55 articles that met inclusion criteria, performed a quality appraisal following CASP guidelines, and extracted and analyzed data focusing on QD features. In general, three primary findings emerged. First, despite some inconsistencies, most QD publications exhibited the characteristics originally observed by Sandelowski (2000) and summarized in the limited QD literature available. Second, there were no clear boundaries in the methods used in the QD studies included in this review; in a number of studies, researchers adopted and combined techniques originating from other qualitative traditions to obtain rich data and deepen their understanding of the phenomenon under investigation. Finally, justification for how QD was chosen and why it was an appropriate fit for a particular study is an area in need of increased attention.

In general, the overall characteristics were consistent with design features of QD studies described in the literature ( Neergaard et al., 2009 ; Sandelowski, 2000 , 2010 ; Vaismoradi et al., 2013 ). For example, many authors reported that study objectives were to describe or explore participants’ experiences and factors related to certain phenomena, events, or interventions. In most cases, these authors cited Sandelowski (2000) as a reference for this particular characteristic. It was rare that theoretical or philosophical frameworks were identified, which also is consistent with descriptions of QD. In most studies, researchers used purposeful sampling and its derivative sampling techniques, collected data through interviews, and analyzed data using qualitative content analysis or thematic analysis. Moreover, all researchers presented focused or comprehensive, descriptive summaries of data including themes or categories answering their research questions. These characteristics do not indicate that there are correct ways to do QD studies; rather, they demonstrate how others designed and produced QD studies.

In several studies, researchers combined techniques that originated from other qualitative traditions for sampling, data collection, and analysis. This flexibility or variability, a key feature of recently published QD studies, may indicate that there are no clear boundaries in designing QD studies. Sandelowski (2010) articulated: “in the actual world of research practice, methods bleed into each other; they are so much messier than textbook depictions” (p. 81). Hammersley (2007) also observed:

“We are not so much faced with a set of clearly differentiated qualitative approaches as with a complex landscape of variable practice in which the inhabitants use a range of labels (‘ethnography’, ‘discourse analysis’, ‘life history work’, narrative study’, ……, and so on) in diverse and open-ended ways in order to characterize their orientation, and probably do this somewhat differently across audiences and occasions” (p. 293).

This concept of having no clear boundaries in methods when designing a QD study should enable researchers to obtain rich data and produce a comprehensive summary of data through various data collection and analysis approaches to answer their research questions. For example, using an ethnographical approach (e.g., participant observation) in data collection for a QD study may facilitate an in-depth description of participants’ nonverbal expressions and interactions with others and their environment as well as situations or events in which researchers are interested ( Kawulich, 2005 ). One example found in our review is that Adams et al. (2014) explored family members’ responses to nursing communication strategies for patients in intensive care units (ICUs). In this study, researchers conducted interviews with family members, observed interactions between healthcare providers, patients, and family members in ICUs, attended ICU rounds and family meetings, and took field notes about their observations and reflections. Accordingly, the variability in methods provided Adams and colleagues (2014) with many different aspects of data that were then used to complement participants’ interviews (i.e., data triangulation). Moreover, by using a constant comparison technique in addition to qualitative content analysis or thematic analysis in QD studies, researchers compare each case with others looking for similarities and differences as well as reasoning why differences exist, to generate more general understanding of phenomena of interest ( Thorne, 2000 ). In fact, this constant comparison analysis is compatible with qualitative content analysis and thematic analysis and we found several examples of using this approach in studies we reviewed ( Asemani et al., 2014 ; DeBruyn et al., 2014 ; Holland et al., 2014 ; Johansson et al., 2014 ; Li et al., 2014 ; Oosterveld-Vlug et al., 2014 ).

However, this flexibility or variability in the methods of QD studies may confuse readers, as well as researchers, when designing and labeling qualitative studies ( Neergaard et al., 2009 ). In particular, it could be difficult for scholars unfamiliar with qualitative research to differentiate QD studies with "hues, tones, and textures" of qualitative traditions ( Sandelowski, 2000 , p. 337) from grounded theory, phenomenological, and ethnographical research. In fact, the major difference lies in the presentation of the findings (or outcomes of qualitative research) ( Neergaard et al., 2009 ; Sandelowski, 2000 ). The final products of grounded theory, phenomenological, and ethnographical research are, respectively, the generation of a theory, a description of the meaning or essence of people's lived experience, and an in-depth narrative description of a certain culture, each produced through researchers' intensive interpretation, reflection, and/or transformation of data ( Streubert & Carpenter, 2011 ). In contrast, QD studies result in "a rich, straight description" of experiences, perceptions, or events using language from the collected data ( Neergaard et al., 2009 ) through low-inference (or data-near) interpretation during data analysis ( Sandelowski, 2000 , 2010 ). This feature is consistent with our finding regarding presentation of findings: in all QD articles included in this systematic review, the researchers presented focused or comprehensive descriptive summaries answering their research questions.

Finally, an explanation or justification of why a QD approach was chosen or appropriate for the study aims was not found in more than half of studies in the sample. While other qualitative approaches, including grounded theory, phenomenology, ethnography, and narrative analysis, are used to better understand people’s thoughts, behaviors, and situations regarding certain phenomena ( Sullivan-Bolyai et al., 2005 ), as noted above, the results will likely read differently than those for a QD study ( Carter & Little, 2007 ). Therefore, it is important that researchers accurately label and justify their choices of approach, particularly for studies focused on participants’ experiences, which could be addressed with other qualitative traditions. Justifying one’s research epistemology, methodology, and methods allows readers to evaluate these choices for internal consistency, provides context to assist in understanding the findings, and contributes to the transparency of choices, all of which enhance the rigor of the study ( Carter & Little, 2007 ; Wu, Thompson, Aroian, McQuaid, & Deatrick, 2016 ).

Use of the CASP tool drew our attention to the credibility and usefulness of the findings of the QD studies included in this review. Although justification for study design and methods was lacking in many articles, most authors reported recruitment, data collection, and analysis techniques that appeared adequate. Internal consistency among study objectives, methods, and findings was achieved in most studies, increasing readers' confidence that the findings of these studies are credible and useful in understanding under-explored phenomena of interest.

In summary, our findings support the notion that many scholars employ QD and include a variety of commonly observed characteristics in their study design and subsequent publications. Based on our review, we found that QD as a scholarly approach allows flexibility as research questions and study findings emerge. We encourage authors to provide as many details as possible regarding how QD was chosen for a particular study, as well as details regarding methods, to facilitate readers' understanding and evaluation of the study design and rigor. We acknowledge the challenge of strict word limits for print journals; potential solutions include collaborating with journal editors and staff on creative use of charts or tables, or using more citations and less text in background sections so that methods sections can remain robust.

Limitations

Several limitations of this review deserve mention. First, only articles in which the researchers explicitly stated in the main body that a QD design was employed were included. In contrast, articles labeled as QD only in the title or abstract, or articles in which the research design was not named, were not examined because of the lack of certainty that the researchers actually carried out a QD study. As a result, we may have excluded some studies in which a QD design was followed. Second, only one database was searched; therefore, we did not identify or describe potential QD studies published in non-PubMed databases. Third, our review is limited by its reliance on what was included in the published version of each study. In some cases, omissions may have resulted from word limits or specific styles imposed by journals, or from authors' inconsistent reporting preferences, which may have limited our ability to appraise general adequacy with the CASP tool and to examine specific characteristics of these studies.

Conclusions

A systematic review was conducted by examining QD research articles focused on nursing-related phenomena and published in one calendar year. Current patterns include some characteristics of QD studies consistent with the previous observations described in the literature, a focus on the flexibility or variability of methods in QD studies, and a need for increased explanations of why QD was an appropriate label for a particular study. Based on these findings, recommendations include encouragement to authors to provide as many details as possible regarding the methods of their QD study. In this way, readers can thoroughly consider and examine if the methods used were effective and reasonable in producing credible and useful findings.

Acknowledgments

This work was supported in part by the John A. Hartford Foundation’s National Hartford Centers of Gerontological Nursing Excellence Award Program.

Hyejin Kim is a Ruth L. Kirschstein NRSA Predoctoral Fellow (F31NR015702) and 2013–2015 National Hartford Centers of Gerontological Nursing Excellence Patricia G. Archbold Scholar. Justine Sefcik is a Ruth L. Kirschstein Predoctoral Fellow (F31NR015693) through the National Institutes of Health, National Institute of Nursing Research.

Conflict of Interest Statement

The Authors declare that there is no conflict of interest.

Contributor Information

Hyejin Kim, MSN, CRNP, Doctoral Candidate, University of Pennsylvania School of Nursing.

Justine S. Sefcik, MS, RN, Doctoral Candidate, University of Pennsylvania School of Nursing.

Christine Bradway, PhD, CRNP, FAAN, Associate Professor of Gerontological Nursing, University of Pennsylvania School of Nursing.

  • Adams JA, Anderson RA, Docherty SL, Tulsky JA, Steinhauser KE, Bailey DE Jr. Nursing strategies to support family members of ICU patients at high risk of dying. Heart & Lung. 2014;43(5):406–415.
  • Ahlin J, Ericson-Lidman E, Norberg A, Strandberg G. Care providers' experiences of guidelines in daily work at a municipal residential care facility for older people. Scandinavian Journal of Caring Sciences. 2014;28(2):355–363.
  • Al-Zadjali M, Keller C, Larkey L, Evans B. GCC women: causes and processes of midlife weight gain. Health Care for Women International. 2014;35(11–12):1267–1286.
  • Asemani O, Iman MT, Moattari M, Tabei SZ, Sharif F, Khayyer M. An exploratory study on the elements that might affect medical students' and residents' responsibility during clinical training. Journal of Medical Ethics and History of Medicine. 2014;7:8.
  • Atefi N, Abdullah KL, Wong LP, Mazlom R. Factors influencing registered nurses perception of their overall job satisfaction: a qualitative study. International Nursing Review. 2014;61(3):352–360.
  • Ballangrud R, Hall-Lord ML, Persenius M, Hedelin B. Intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care: a descriptive qualitative study. Intensive and Critical Care Nursing. 2014;30(4):179–187.
  • Benavides-Vaello S, Katz JR, Peterson JC, Allen CB, Paul R, Charette-Bluff AL, Morris P. Nursing and health sciences workforce diversity research using PhotoVoice: a college and high school student participatory project. Journal of Nursing Education. 2014;53(4):217–222.
  • Bernhard C, Zielinski R, Ackerson K, English J. Home birth after hospital birth: women's choices and reflections. Journal of Midwifery and Women's Health. 2014;59(2):160–166.
  • Borbasi S, Jackson D, Langford RW. Navigating the maze of nursing research: An interactive learning adventure. 2nd ed. New South Wales, Australia: Mosby/Elsevier; 2008.
  • Bradford B, Maude R. Fetal response to maternal hunger and satiation – novel finding from a qualitative descriptive study of maternal perception of fetal movements. BMC Pregnancy and Childbirth. 2014;14:288.
  • Burns N, Grove SK. The practice of nursing research: Conduct, critique, & utilization. 5th ed. Philadelphia, PA: Elsevier/Saunders; 2005.
  • Canzan F, Heilemann MV, Saiani L, Mortari L, Ambrosi E. Visible and invisible caring in nursing from the perspectives of patients and nurses in the gerontological context. Scandinavian Journal of Caring Sciences. 2014;28(4):732–740.
  • Carter SM, Little M. Justifying knowledge, justifying method, taking action: Epistemologies, methodologies, and methods in qualitative research. Qualitative Health Research. 2007;17(10):1316–1328.
  • Critical Appraisal Skills Programme (CASP). 10 questions to help you make sense of qualitative research. Oxford: CASP; 2013. Retrieved from http://media.wix.com/ugd/dded87_29c5b002d99342f788c6ac670e49f274.pdf
  • Chan CW, Lopez V. A qualitative descriptive study of risk reduction for coronary disease among the Hong Kong Chinese. Public Health Nursing. 2014;31(4):327–335.
  • Chen YJ, Tsai YF, Lee SH, Lee HL. Protective factors against suicide among young-old Chinese outpatients. BMC Public Health. 2014;14:372.
  • Cleveland LM, Bonugli R. Experiences of mothers of infants with neonatal abstinence syndrome in the neonatal intensive care unit. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2014;43(3):318–329.
  • Corbin J, Strauss A. Basics of qualitative research: Techniques and procedures for developing grounded theory. 3rd ed. Thousand Oaks, CA: Sage Publications; 2008.
  • Corbin JM, Strauss A. Grounded theory research: Procedures, canons and evaluation criteria. Qualitative Sociology. 1990;13(1):3–21.
  • DeBruyn RR, Ochoa-Marin SC, Semenic S. Barriers and facilitators to evidence-based nursing in Colombia: perspectives of nurse educators, nurse researchers and graduate students. Investigación y Educación en Enfermería. 2014;32(1):9–21.
  • Denzin NK, Lincoln YS. The discipline and practice of qualitative research. In: Denzin NK, Lincoln YS, editors. Handbook of qualitative research. 2nd ed. Thousand Oaks, CA: Sage Publications; 2000. pp. 1–28.
  • Ewens B, Chapman R, Tulloch A, Hendricks JM. ICU survivors' utilisation of diaries post discharge: a qualitative descriptive study. Australian Critical Care. 2014;27(1):28–35.
  • Fantasia HC, Sutherland MA, Fontenot H, Ierardi JA. Knowledge, attitudes and beliefs about contraceptive and sexual consent negotiation among college women. Journal of Forensic Nursing. 2014;10(4):199–207.
  • Friman A, Wahlberg AC, Mattiasson AC, Ebbeskog B. District nurses' knowledge development in wound management: ongoing learning without organizational support. Primary Health Care Research & Development. 2014;15(4):386–395.
  • Gaughan V, Logan D, Sethna N, Mott S. Parents' perspective of their journey caring for a child with chronic neuropathic pain. Pain Management Nursing. 2014;15(1):246–257.
  • Hammersley M. The issue of quality in qualitative research. International Journal of Research & Method in Education. 2007;30(3):287–305.
  • Hart PL, Mareno N. Cultural challenges and barriers through the voices of nurses. Journal of Clinical Nursing. 2014;23(15–16):2223–2232.
  • Hasman K, Kjaergaard H, Esbensen BA. Fathers' experience of childbirth when non-progressive labour occurs and augmentation is established. A qualitative study. Sexual & Reproductive HealthCare. 2014;5(2):69–73.
  • Higgins I, van der Riet P, Sneesby L, Good P. Nutrition and hydration in dying patients: the perceptions of acute care nurses. Journal of Clinical Nursing. 2014;23(17–18):2609–2617.
  • Holland ML, Christensen JJ, Shone LP, Kearney MH, Kitzman HJ. Women's reasons for attrition from a nurse home visiting program. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2014;43(1):61–70.
  • Johansson M, Hildingsson I, Fenwick J. 'As long as they are safe – birth mode does not matter' Swedish fathers' experiences of decision-making around caesarean section. Women and Birth. 2014;27(3):208–213.
  • Kao MH, Tsai YF. Illness experiences in middle-aged adults with early-stage knee osteoarthritis: findings from a qualitative study. Journal of Advanced Nursing. 2014;70(7):1564–1572.
  • Kawulich BB. Participant observation as a data collection method. Forum: Qualitative Social Research. 2005;6(2):Art. 43. Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/466/997
  • Kerr D, McKay K, Klim S, Kelly AM, McCann T. Attitudes of emergency department patients about handover at the bedside. Journal of Clinical Nursing. 2014;23(11–12):1685–1693.
  • Kneck A, Fagerberg I, Eriksson LE, Lundman B. Living with diabetes – development of learning patterns over a 3-year period. International Journal of Qualitative Studies on Health and Well-being. 2014;9:24375.
  • Krippendorff K. Content analysis: An introduction to its methodology. 2nd ed. Thousand Oaks, CA: Sage Publications; 2004.
  • Larocque N, Schotsman C, Kaasalainen S, Crawshaw D, McAiney C, Brazil E. Using a book chat to improve attitudes and perceptions of long-term care staff about dementia. Journal of Gerontological Nursing. 2014;40(5):46–52.
  • Li IC, Lee SY, Chen CY, Jeng YQ, Chen YC. Facilitators and barriers to effective smoking cessation: counselling services for inpatients from nurse-counsellors' perspectives – a qualitative study. International Journal of Environmental Research and Public Health. 2014;11(5):4782–4798.
  • Lux KM, Hutcheson JB, Peden AR. Ending disruptive behavior: staff nurse recommendations to nurse educators. Nurse Education in Practice. 2014;14(1):37–42.
  • Lyndon A, Zlatnik MG, Maxfield DG, Lewis A, McMillan C, Kennedy HP. Contributions of clinical disconnections and unresolved conflict to failures in intrapartum safety. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2014;43(1):2–12.
  • Ma F, Li J, Liang H, Bai Y, Song J. Baccalaureate nursing students' perspectives on learning about caring in China: a qualitative descriptive study. BMC Medical Education. 2014;14:42.
  • Ma L. A humanbecoming qualitative descriptive study on quality of life with older adults. Nursing Science Quarterly. 2014; 27 (2):132–141. [ PubMed ] [ Google Scholar ]
  • Marcinowicz L, Abramowicz P, Zarzycka D, Abramowicz M, Konstantynowicz J. How hospitalized children and parents perceive nurses and hospital amenities: A qualitative descriptive study in Poland. Journal of Child Health Care. 2014 [ PubMed ] [ Google Scholar ]
  • Martorella G, Boitor M, Michaud C, Gelinas C. Feasibility and acceptability of hand massage therapy for pain management of postoperative cardiac surgery patients in the intensive care unit. Heart & Lung. 2014; 43 (5):437–444. [ PubMed ] [ Google Scholar ]
  • McDonough A, Callans KM, Carroll DL. Understanding the challenges during transitions of care for children with critical airway conditions. ORL Head and Neck Nursing. 2014; 32 (4):12–17. [ PubMed ] [ Google Scholar ]
  • McGilton KS, Boscart VM, Brown M, Bowers B. Making tradeoffs between the reasons to leave and reasons to stay employed in long-term care homes: perspectives of licensed nursing staff. International Journal of Nursing Studies. 2014; 51 (6):917–926. [ PubMed ] [ Google Scholar ]
  • Michael N, O'Callaghan C, Baird A, Hiscock N, Clayton J. Cancer caregivers advocate a patient- and family-centered approach to advance care planning. Journal of Pain and Symptom Management. 2014; 47 (6):1064–1077. [ PubMed ] [ Google Scholar ]
  • Miller WR. Patient-centered outcomes in older adults with epilepsy. Seizure. 2014; 23 (8):592–597. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Milne J, Oberle K. Enhancing rigor in qualitative description: a case study. Journal of Wound Ostomy & Continence Nursing. 2005; 32 (6):413–420. [ PubMed ] [ Google Scholar ]
  • Neergaard MA, Olesen F, Andersen RS, Sondergaard J. Qualitative description - the poor cousin of health research? BMC Medical Research Methodology. 2009; 9 :52. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • O'Shea MF. Staff nurses' perceptions regarding palliative care for hospitalized older adults. The American Journal of Nursing. 2014; 114 (11):26–34. [ PubMed ] [ Google Scholar ]
  • Oosterveld-Vlug MG, Pasman HR, van Gennip IE, Muller MT, Willems DL, Onwuteaka-Philipsen BD. Dignity and the factors that influence it according to nursing home residents: a qualitative interview study. Journal of Advanced Nursing. 2014; 70 (1):97–106. [ PubMed ] [ Google Scholar ]
  • Oruche UM, Draucker C, Alkhattab H, Knopf A, Mazurcyk J. Interventions for family members of adolescents with disruptive behavior disorders. Journal of Child and Adolescent Psychiatric Nursing. 2014; 27 (3):99–108. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Parse RR. Qualitative inquiry: The path of sciencing. Sudbury, MA: Jones and Barlett; 2001. [ Google Scholar ]
  • Peacock SC, Hammond-Collins K, Forbes DA. The journey with dementia from the perspective of bereaved family caregivers: a qualitative descriptive study. BMC Nursing. 2014; 13 (1):42. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Peterson WE, Sprague AE, Reszel J, Walker M, Fell DB, Perkins SL, Johnson M. Women's perspectives of the fetal fibronectin testing process: a qualitative descriptive study. BMC Pregnancy and Childbirth. 2014; 14 :190. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Polit DF, Beck CT. Nursing research: principles and methods. 7. Philadelphia, PA: Lippincott Williams & Wilkins; 2004. [ Google Scholar ]
  • Polit DF, Beck CT. International differences in nursing research, 2005–2006. Journal of Nursing Scholarship. 2009; 41 (1):44–53. [ PubMed ] [ Google Scholar ]
  • Polit DF, Beck CT. Nursing research: generating and assessing evidence for nursing practice. 9. Philadelphia, PA: Wolters Kluwer Health/Lippincott Williams & Wilkins; 2012. [ Google Scholar ]
  • Polit DF, Beck CT. Essentials of Nursing Research: Appraising Evidence for Nursing Practice. 8. Philadelphia, PA: Wolters Kluwer Health; Lippincott Willians & Wilkins; 2014. Supplement for Chapter 14: Qualitative Descriptive Studies. Retrieved from http://downloads.lww.com/wolterskluwer_vitalstream_com/sample-content/9781451176797_Polit/samples/CS_Chapter_14.pdf . [ Google Scholar ]
  • Pope C, Mays N. Qualitative research in health care. 3rd. Victoria, Australia: Blackwell Publishing; 2006. [ Google Scholar ]
  • Pope C, Mays N. Reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research. BMJ. 1995; 311 (6996):42–45. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Raphael D, Waterworth S, Gott M. The role of practice nurses in providing palliative and end-of-life care to older patients with long-term conditions. International Journal of Palliative Nursing. 2014; 20 (8):373–379. [ PubMed ] [ Google Scholar ]
  • Saldana J. Longitudinal qualitative research: Analyzing change through time. Walnut Creek, CA: AltaMira Press; 2003. [ Google Scholar ]
  • Sandelowski M. Whatever happened to qualitative description? Research in Nursing & Health. 2000; 23 (4):334–340. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. What's in a name? Qualitative description revisited. Research in Nursing & Health. 2010; 33 (1):77–84. [ PubMed ] [ Google Scholar ]
  • Santos HP, Jr, Sandelowski M, Gualda DM. Bad thoughts: Brazilian women's responses to mothering while experiencing postnatal depression. Midwifery. 2014; 30 (6):788–794. [ PubMed ] [ Google Scholar ]
  • Sharp R, Grech C, Fielder A, Mikocka-Walus A, Cummings M, Esterman A. The patient experience of a peripherally inserted central catheter (PICC): A qualitative descriptive study. Contemporary Nurse. 2014; 48 (1):26–35. [ PubMed ] [ Google Scholar ]
  • Soule I. Cultural competence in health care: an emerging theory. ANS Advances in Nursing Science. 2014; 37 (1):48–60. [ PubMed ] [ Google Scholar ]
  • Stegenga K, Macpherson CF. "I'm a survivor, go study that word and you'll see my name": adolescent and cancer identity work over the first year after diagnosis. Cancer Nursing. 2014; 37 (6):418–428. [ PubMed ] [ Google Scholar ]
  • Streubert HJ, Carpenter DR. Qualitative research in nursing: Advancing the humanistic imperative. 5th. Philadelphia, PA: Lippincott Williams & Wilkins; 2011. [ Google Scholar ]
  • Sturesson A, Ziegert K. Prepare the patient for future challenges when facing hemodialysis: nurses' experiences. International Journal of Qualitative Studies on Health and Well-being. 2014; 9 :22952. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Sullivan-Bolyai S, Bova C, Harper D. Developing and refining interventions in persons with health disparities: the use of qualitative description. Nursing Outlook. 2005; 53 (3):127–133. [ PubMed ] [ Google Scholar ]
  • Sundqvist AS, Carlsson AA. Holding the patient's life in my hands: Swedish registered nurse anaesthetists' perspective of advocacy. Scandinavian Journal of Caring Sciences. 2014; 28 (2):281–288. [ PubMed ] [ Google Scholar ]
  • Thomson Reuters. EndNote X7. 2014 Retrieved from http://endnote.com/product-details/x7 .
  • Thorne S. Data analysis in qualitative research. Evidence Based Nursing. 2000; 3 :68–70. [ Google Scholar ]
  • Thorne S, Reimer Kirkham S, O’Flynn-Magee K. The analytic challenge in interpretive description. International Journal of Qualitative Methods. 2004; 3 (1):1–11. [ Google Scholar ]
  • Tseng YF, Chen CH, Wang HH. Taiwanese women's process of recovery from stillbirth: a qualitative descriptive study. Research in Nursing & Health. 2014; 37 (3):219–228. [ PubMed ] [ Google Scholar ]
  • Vaismoradi M, Jordan S, Turunen H, Bondas T. Nursing students' perspectives of the cause of medication errors. Nurse Education Today. 2014; 34 (3):434–440. [ PubMed ] [ Google Scholar ]
  • Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & Health Sciences. 2013; 15 (3):398–405. [ PubMed ] [ Google Scholar ]
  • Valizadeh L, Zamanzadeh V, Fooladi MM, Azadi A, Negarandeh R, Monadi M. The image of nursing, as perceived by Iranian male nurses. Nursing & Health Sciences. 2014; 16 (3):307–313. [ PubMed ] [ Google Scholar ]
  • Villar F, Celdran M, Faba J, Serrat R. Barriers to sexual expression in residential aged care facilities (RACFs): comparison of staff and residents' views. Journal of Advanced Nursing. 2014; 70 (11):2518–2527. [ PubMed ] [ Google Scholar ]
  • Wiens S, Babenko-Mould Y, Iwasiw C. Clinical instructors' perceptions of structural and psychological empowerment in academic nursing environments. Journal of Nursing Education. 2014; 53 (5):265–270. [ PubMed ] [ Google Scholar ]
  • Willis DG, Sullivan-Bolyai S, Knafl K, Zichi-Cohen M. Distinguishing Features and Similarities Between Descriptive Phenomenological and Qualitative Description Research. West J Nurs Res. 2016 [ PubMed ] [ Google Scholar ]
  • Wu YP, Thompson D, Aroian KJ, McQuaid EL, Deatrick JA. Commentary: Writing and Evaluating Qualitative Research Reports. J Pediatr Psychol. 2016; 41 (5):493–505. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Zhang H, Shan W, Jiang A. The meaning of life and health experience for the Chinese elderly with chronic illness: a qualitative study from positive health philosophy. International Journal of Nursing Practice. 2014; 20 (5):530–539. [ PubMed ] [ Google Scholar ]
  • Open access
  • Published: 17 February 2024

Knowledge translation strategies used for sustainability of an evidence-based intervention in child health: a multimethod qualitative study

Christine E. Cassidy (ORCID: orcid.org/0000-0001-7770-5058), Rachel Flynn, Alyson Campbell, Lauren Dobson, Jodi Langley, Deborah McNeil, Ella Milne, Pilar Zanoni, Megan Churchill & Karen M. Benzies

BMC Nursing volume 23, Article number: 125 (2024)


Sustainability of evidence-based interventions (EBIs) is suboptimal in healthcare. Evidence on how knowledge translation (KT) strategies are used for the sustainability of EBIs in practice is lacking. This study examined what and how KT strategies were used to facilitate the sustainability of Alberta Family Integrated Care (FICare)™, a psychoeducational model of care scaled and spread across 14 neonatal intensive care units, in Alberta, Canada.

First, we conducted an environmental scan of relevant documents to determine the use of KT strategies to support the sustainability of Alberta FICare™. Second, we conducted semi-structured interviews with decision makers and operational leaders to explore what and how KT strategies were used for the sustainability of Alberta FICare™, as well as barriers and facilitators to using the KT strategies for sustainability. We used the Expert Recommendations for Implementing Change (ERIC) taxonomy to code the strategies. Lastly, we facilitated consultation meetings with the Alberta FICare™ leads to share and gain insights and clarification on our findings.

We identified nine KT strategies that facilitated the sustainability of Alberta FICare™: Conduct ongoing training; Identify and prepare local champions; Research co-production; Remind clinicians; Audit and provide feedback; Change record systems; Promote adaptability; Access new funding; and Involve patients/consumers and family members. A significant barrier to the sustainability of Alberta FICare™ was a lack of clarity on who was responsible for the ongoing maintenance of the intervention. A key facilitator to the sustainability of Alberta FICare™ was its alignment with the Maternal, Newborn, Child & Youth Strategic Clinical Network (MNCY SCN) priorities. Co-production between researchers and health system partners in the design, implementation, and scale and spread of Alberta FICare™ was critical to sustainability.

This research highlights the importance of clearly articulating who is responsible for continued championing for the sustainability of EBIs. Additionally, our research demonstrates that the adaptation of interventions must be considered from the onset of implementation so interventions can be tailored to align with contextual barriers for sustainability. Clear guidance is needed to continually support researchers and health system leaders in co-producing strategies that facilitate the long-term sustainability of effective EBIs in practice.


Given that the nursing profession represents the largest percentage of the healthcare workforce, nurses have considerable potential to translate evidence into practice and improve patient and health system outcomes [ 1 , 2 ]. Evidence-based interventions (EBIs; e.g., clinical practice guidelines, clinical pathways, innovations, models of care) are useful for translating evidence into nursing practice; however, the availability of EBIs does not guarantee that they will be successfully implemented, adopted, and sustained in practice [ 3 , 4 ]. The field of implementation science has a robust literature on knowledge translation (KT) strategies to promote the implementation of EBIs into practice [ 5 ]. KT strategies are defined as “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice” [ 6 ]. Examples of KT strategies include educational approaches, audit and feedback, and clinical champions [ 7 ]. There is an abundance of evidence on the use of KT strategies [ 8 , 9 , 10 , 11 ] for the implementation of various EBIs with different stakeholders (e.g., nurses, physiotherapists, physicians) [ 12 , 13 ], across different health contexts [ 14 , 15 ]. To date, this literature focuses primarily on the use of KT strategies for the implementation process of EBIs into different healthcare contexts. There is limited consolidated empirical evidence on what and how KT strategies are used for the sustainability of EBIs in healthcare institutional settings (e.g., hospitals, long-term care organizations).

Sustainability is conceptualized as both a process and an implementation outcome and is a priority issue for health services research [ 16 , 17 ]. Moore et al. describe sustainability as the state in which, after a defined period of time, the program, clinical intervention, and/or implementation strategies continue to be delivered and/or individual behavior change (i.e., clinician, patient) is maintained; the program and individual behavior change may evolve or adapt while continuing to produce benefits for individuals and systems. The sustainability concept differs from scale up and spread, which Greenhalgh and Papoutsi [ 18 ] define as building infrastructure to support full-scale implementation (scale up) and replicating an intervention (spread). Sustainability of EBIs continues to be suboptimal across healthcare institutions, owing to a lack of understanding of the strategies available to support sustainability [ 19 ]. Our recent scoping review synthesized 25 studies and found that training, education, and the development of interrelationships between researchers and knowledge users are the most common types of KT strategies used to sustain EBIs [ 20 ]. A key finding from our review was the need for clearer description and reporting of KT strategies used for the sustainability of EBIs, and for research that describes how to use KT strategies to sustain EBIs [ 20 ]. This information is critical to support nurses and nurse leaders in implementing and sustaining EBIs in a variety of healthcare contexts.

To address the knowledge gaps found in our scoping review, this study aimed to explore what and how KT strategies are used to facilitate the sustainability of one EBI that has been scaled and spread across the context of Alberta Health Services (AHS), Canada. Given its robust evidence base and successful implementation across the province of Alberta, Canada, we selected Alberta Family Integrated Care (FICare)™ as the case EBI for this study. Alberta FICare™ is a theoretically driven, psychoeducational model of care that enhances family-centered care practice, is driven by the multi-disciplinary team (largely comprised of nurses), and empowers parents of infants admitted to the neonatal intensive care unit (NICU) with the knowledge, skills, and confidence to facilitate an earlier discharge home [ 21 , 22 ]. Modeled on a program in Estonia, a model of FICare for level 3 NICUs was first implemented as a pilot study in 2011 at Mount Sinai Hospital in Toronto, ON. Alberta FICare™ was adapted from the level 3 NICU model and subsequently implemented and evaluated in 10 level 2 NICUs across Alberta in a cluster randomized controlled trial (cRCT) [ 23 , 24 , 25 ]. Successful implementation of Alberta FICare™ decreased length of stay (LOS) by 2.55 days, without significant increases in readmissions or emergency department (ED) visits, compared with moderate to late preterm infants in a standard care group [ 23 ]. Parents who engaged with Alberta FICare™ reported reduced psychological distress and improved confidence in caring for their infant [ 26 , 27 ]. This increased confidence and positive experience gained from the integration of Alberta FICare™ into practice has the potential to improve infant-parent relationships, which ultimately supports communication skill development in infants [ 21 ], improved neurodevelopment in preterm infants [ 28 ], and increased confidence in parents' transition home with their infant [ 26 , 27 ].
In 2019, Alberta FICare™ spread and scale was initiated for all 14 NICUs across the province [ 29 ]. Previous research has been conducted to explore barriers and facilitators to implementation of the Alberta FICare™ in clinical practice [ 22 ]; however, no research has been conducted to examine what and how KT strategies were used to facilitate the sustainability of Alberta FICare™ across the province.

Research purpose

This study examined what and how KT strategies were used to facilitate the sustainability of Alberta FICare™ in level II and level III NICUs across Alberta, Canada.

Our research objectives were to:

Identify what and how KT strategies are used to support the sustainability of Alberta FICare™; and

Understand the perceived barriers and facilitators to using KT strategies for the sustainability of Alberta FICare™.

We conducted a multimethod qualitative study across three sequential phases: (1) environmental scan of relevant documents (policies, guidelines, meeting notes, protocols, etc.) on the use of KT strategies to support the sustainability of Alberta FICare™; (2) key informant interviews with nurses, decision makers, administrators, and operational leaders with experience implementing and sustaining Alberta FICare™; and (3) consultation with the Alberta FICare™ leads to share and gain insights and clarification on our findings. We defined sustainability as use of the EBI beyond 1 year of initial implementation of Alberta FICare™ at the specific site [ 30 ]. Alberta FICare™ was initially implemented in five test sites involved in the cRCT. From there, the EBI was spread and scaled to all control sites involved in the cRCT and remaining NICUs in the province, for a total of 14 NICUs. The researchers responsible for data collection and analysis (CEC, RF, LD, EM, JL) were external to the Alberta Health Services NICU setting and did not have any relationships with participants.

Phase 1: environmental scan

An environmental scan is a passive strategy for externally examining a phenomenon of interest using existing sources of information [ 31 ]. Our environmental scan included a systematic approach to searching relevant documents, extracting data, and synthesizing the findings.

Search strategy

We sourced a range of documents for the environmental scan, including project management plans, open-access journal articles, knowledge user presentations, and meeting documents on initial implementation from the cRCT and scale and spread provided by the Alberta FICare™ Project Team. Further, we explored the AHS website on Alberta FICare™ to identify items related to sustainability. We held two meetings with the Alberta FICare™ Project Team to identify any additional documentation for the environmental scan. During these meetings, it was agreed that any documents that had any personal identification (i.e., names of individuals) would be excluded or de-identified for analysis.

Data extraction

We created a data extraction form in Excel to collect relevant information on: document source type (protocol, policy, meeting notes, etc.); authors; year; definition of the sustainability concept or phase (if reported); type of KT strategy used according to the Expert Recommendations for Implementing Change (ERIC) taxonomy [ 7 ]; KT strategy description using the Aims, Ingredients, Mechanism, Delivery (AIMD) framework [ 32 ]; adaptations/modifications of the KT strategy from implementation to sustainability; reported barriers and facilitators to sustainability; and reported outcomes. One reviewer (JL) extracted all data using the data extraction form. The research team then met collectively and reviewed the extracted data to determine whether any additional information needed to be extracted.

Data analysis

We produced descriptive numerical summaries of the quantitative data (i.e., frequency of document types, KT strategy, barriers and facilitators, outcomes, etc.). Next, two team members (CC, JL) conducted deductive content analysis to categorize the KT Strategies using the ERIC taxonomy consisting of 73 strategies [ 7 , 33 ]. If data did not map onto the ERIC taxonomy, we coded it under “other”. Findings are reported narratively and in tabular formats.
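As a purely illustrative sketch of these two steps (the study itself used an Excel extraction form, not code), the deductive coding against a taxonomy with an "other" fallback and the descriptive frequency summaries could look as follows. The example records, and the three-item subset standing in for the 73-strategy ERIC taxonomy, are invented for illustration.

```python
from collections import Counter

# Invented example records standing in for rows of the Excel extraction form.
records = [
    {"doc_type": "protocol",        "strategy": "Conduct ongoing training"},
    {"doc_type": "meeting notes",   "strategy": "Identify and prepare champions"},
    {"doc_type": "meeting notes",   "strategy": "Audit and provide feedback"},
    {"doc_type": "journal article", "strategy": "Quarterly newsletter to families"},
]

# Illustrative subset of ERIC strategy labels (the real taxonomy lists 73).
eric_taxonomy = {
    "Conduct ongoing training",
    "Identify and prepare champions",
    "Audit and provide feedback",
}

def code_strategy(label: str) -> str:
    """Deductive content-analysis step: keep the ERIC label if it maps,
    otherwise code the item under 'other'."""
    return label if label in eric_taxonomy else "other"

# Descriptive numerical summaries (frequency of document types and strategies).
doc_type_counts = Counter(r["doc_type"] for r in records)
strategy_counts = Counter(code_strategy(r["strategy"]) for r in records)

print(doc_type_counts["meeting notes"])  # 2
print(strategy_counts["other"])          # 1
```

In the study these tallies were then reported narratively and in tables; the point of the sketch is only the map-or-"other" coding rule and the counting step.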

Phase 2: key informant interviews

The environmental scan was complemented by key informant interviews using a qualitative descriptive design [ 34 ]. The objective of the key informant interviews was to explore KT strategies used to facilitate the sustainability of Alberta FICare™ from the perspectives of nurses, decision makers, administrators, and operational leaders. Ethics approval was granted by the University of Alberta Health Research Ethics Board (CHREB #Pro00116834) and the Covenant Health Research Centre.

Participants

To meet the inclusion criteria for a key informant, participants had to have experience with implementing and sustaining Alberta FICare™. Informants were contacted via email by the Executive Director of the Maternal Newborn Child & Youth (MNCY) Strategic Clinical Network (SCN)™, with follow-up emails sent approximately two and four weeks later if no response was received. Interested participants contacted the Research Assistant (RA), who arranged an online interview via Zoom.

We developed a semi-structured interview guide based on the Consolidated Framework for Sustainability (CFS) [ 35 ] to explore barriers and facilitators to KT strategy use for the sustainability of Alberta FICare™ (Objective 2; See Appendix 1 for Interview Guide). We included prompts of specific ERIC Taxonomy strategies based on findings from our scoping review of KT strategies used to sustain EBIs. Open-ended questions were also included to explore additional strategies that may not be included in the ERIC Taxonomy. Further, additional questions were posed to seek clarification or additional information based on findings from the environmental scan. The interviews were conducted by two researchers (an RA and principal or co-investigator). Interviews lasted between 45 and 60 min. Participants provided written, informed consent before the interview.

Data management and analysis

Audio-recordings for all interviews were transcribed verbatim and de-identified. Data were managed and analyzed using NVivo 12 [ 36 ]. First, two members of the research team (LD, EM) conducted deductive content analysis [ 33 ] to code similar statements related to KT strategies used for sustaining Alberta FICare™. Strategies were deduced according to the ERIC taxonomy of implementation strategies, which consists of 73 distinct strategies categorized into 9 separate clusters [ 7 ]. If data did not map onto the ERIC taxonomy, we coded it under “other”. Next, we used the CFS to code similar statements related to the barriers and facilitators to using the KT strategies [ 35 ]. Two research team members (LD, EM) cross-referenced their analyses and compared their preliminary findings. Together, they developed a final set of themes and summaries which were reviewed and refined by two other research team members (CC, RF). Any discrepancies were resolved through team discussion with co-leads (CC, RF).
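The coding itself was done in NVivo 12; as a tool-agnostic illustration of the cross-referencing step only (not the authors' software), comparing two coders' label assignments and flagging discrepancies for team discussion might look like this. The excerpt IDs and label assignments are invented.

```python
# Invented excerpt IDs and ERIC strategy labels; the actual coding was
# done in NVivo 12 on de-identified interview transcripts.
coder_a = {
    "excerpt_01": "Conduct ongoing training",
    "excerpt_02": "Audit and provide feedback",
    "excerpt_03": "Remind clinicians",
}
coder_b = {
    "excerpt_01": "Conduct ongoing training",
    "excerpt_02": "Change record systems",
    "excerpt_03": "Remind clinicians",
}

# Cross-reference the two analyses: agreements stand, while discrepancies
# are flagged for resolution through team discussion with the co-leads.
agreements = sorted(k for k in coder_a if coder_a[k] == coder_b[k])
discrepancies = {k: (coder_a[k], coder_b[k])
                 for k in coder_a if coder_a[k] != coder_b[k]}

print(agreements)     # ['excerpt_01', 'excerpt_03']
print(discrepancies)  # {'excerpt_02': ('Audit and provide feedback', 'Change record systems')}
```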

Phase 3: consultation with Alberta FICare™ leads

To enhance the methodological rigour of our environmental scan and key informant interviews, we consulted with key Alberta FICare™ Leads to share our findings, gain insights, and seek clarification. Specifically, we worked closely with the Principal Investigator/Scientific Lead of Alberta FICare™ (KB), the Scientific Director of the AHS MNCY SCN (DM), and Project Manager of Alberta FICare™ (PZ). This consultation involved two virtual meetings to discuss relevant documents for the environmental scan and clarify key findings. The Alberta FICare™ Leads also provided insights on our key findings via written feedback and are co-authors on this paper.

The environmental scan identified three ERIC taxonomy KT strategies [ 7 ] used to facilitate EBI (Alberta FICare™) sustainability: (1) conduct ongoing training; (2) identify and prepare champions; and (3) research co-production with the MNCY SCN (Table  1 ). The training and education strategies targeted all clinicians and unit clerks in the NICU with the goal of increasing the level of knowledge on Alberta FICare™. These strategies included 1–2 h of in-person training for sites involved in the cRCT and transitioned to asynchronous online education modules for sites involved in the scale and spread. The second strategy included managers identifying clinical champions within their care setting to take on a larger role and act as facilitators to support sustainability of Alberta FICare™ at the point-of-care with their nurse colleagues. Clinical champions received 3–4 h of educational training on Alberta FICare™. Lastly, the provincial scale and spread of Alberta FICare™ was completed using a co-production approach with the MNCY SCN to support the sustainability of FICare. This included quarterly fidelity audits and debriefs with Local Site Implementation Teams where co-leads and the project manager answered questions and recommended strategies to strengthen implementation.

Characteristics of participants

Of the five individuals interviewed, four identified themselves as female. The five interviewees held diverse roles in both AHS and Covenant Health. Covenant Health is contracted by AHS to deliver healthcare services and is part of Alberta’s integrated health system [ 37 ]. We interviewed two program managers, one unit manager, one clinical nurse educator, and one clinical project manager. Three participants spoke from the Edmonton Zone and two from the Calgary Zone. Three participants worked in a level 2 NICU and two worked in a level 3 NICU.

Types of KT strategies used for sustainability

From the key informant interviews, all data mapped onto the ERIC taxonomy. We identified a total of eight distinct ERIC taxonomy KT strategies used to support the sustainability of Alberta FICare™ (Table  2 ). Three were the same strategies identified in the environmental scan (conduct ongoing training, identify and prepare champions, research co-production). The types of KT strategies used varied by site; however, staff training was the only strategy reported in all interviews. After staff training, the two most reported KT strategies were audit and feedback and changing record systems to support the integration of Alberta FICare™ into the workflow. The eight KT strategies reported for the sustainability of Alberta FICare™, categorized using the ERIC taxonomy, are:

Conduct ongoing training

Training was delivered through eLearning modules completed by multidisciplinary NICU staff. Since implementation, training has continued in various ways and at various intervals across sites. The number of training and educational modules varied depending on whether they were targeted at clinical champions, end users (all other multi-disciplinary staff), or unit clerks. Alberta FICare™ education has been largely integrated into new staff training at many sites. Following orientation, some sites include Alberta FICare™ in their annual education, while others run “education blitzes” (Participant 03). Education strategies also occur in the form of regular emails.

Remind clinicians

Many participants described the use of posters to facilitate the sustainability of Alberta FICare™, as posters reminded clinicians about the intervention and provided resources for staff and parents. For example, at one site, two posters in the staff lounge highlighted the major principles of Alberta FICare™ and outlined frequently asked questions.

Audit and provide feedback

Formal audit and feedback appeared to vary by site. During scale and spread efforts, formal audit and feedback was conducted quarterly. Audit and feedback were mentioned in the form of site visits (from 1 to 2 members of the Alberta FICare™ Project Team) accompanied by comprehensive written audit reports and brief summary ‘report cards’ based on observational feedback. The audit ‘report cards’ used a green, yellow, and red classification system to describe how the site was doing with use of the EBI components.

Change record systems

Because the introduction of Connect Care (an electronic clinical information system) overlapped with the implementation of Alberta FICare™ at most sites, integrating the Parent Education component of Alberta FICare™ into the charting system played a significant role in supporting the sustainability of the new model of care. The integration work with Connect Care is ongoing and specific to the Parent Education component of Alberta FICare™.

Identify and prepare local champions

Participants identified that strong clinical champions supported the ongoing sustainability of Alberta FICare™. Clinical champions varied in their roles but included nurse practitioners, clinical project managers, clinical nurse educators, and neonatologists.

Promote adaptability

Adapting the implementation and sustainability approach for Alberta FICare™ was a strategy used during scale and spread to make it easier for sites to implement and sustain the Alberta FICare™ practices. For example, some sites made changes to the layout of the communication whiteboards in patient rooms. Further, at one site, the addition of volunteer services to support the peer support program decreased the impact of Alberta FICare™ on staff workload.

Access new funding

Most participants identified that few resources were required to sustain Alberta FICare™. Some participants emphasized the need for continued financial support to maintain sustainability into the future, including additional resources for staff training.

Involve patients/consumers and family members

Parent feedback was used as a strategy to facilitate sustainability by guiding how Alberta FICare™ should be implemented in the clinical setting. Throughout the scale and spread, parent feedback was mostly collected in the form of parent surveys and real-time feedback during fidelity audit site visits. Further, parents were involved in all planning meetings and co-designed spread and scale resources, including training modules.

Barriers and facilitators to using KT strategies for sustainability

In addition to the eight KT strategies identified in the environmental scan and key informant interviews, we also identified a range of barriers and facilitators to using the identified KT strategies for sustainability. These barriers and facilitators were categorized across all six constructs of the Consolidated Framework for Sustainability [ 35 ]: the people involved ( n  = 3), organizational setting ( n  = 3), resources ( n  = 3), negotiating initiative processes ( n  = 2), the external environment ( n  = 1), and EBI design and delivery ( n  = 1).

While the ongoing training and education was seen as helpful for orienting new staff, participants described barriers to completing the Alberta FICare™ modules. Some participants believed the educational strategies increased staff workload, despite funding being available to backfill time for nurses to complete the learning modules. Further, there were some barriers to finding the educational materials, which impacted their use, despite the materials being available via the internal website. One participant noted that accessibility is a barrier, explaining that “ if people have to go searching for something, it’s less likely to be used, right? ” (Participant 05).

The placement and size of the staff posters, and thus their overall visibility, was reported to facilitate their impact as a reminder of Alberta FICare™. When asked how the posters facilitate the sustainability of Alberta FICare™, one participant explained “ they are at least a constant visual reminder to both parents and staff that this is a—I don’t want to say an expectation, but this is something that’s important to our unit. Just having that constant reminder that it’s there will definitely help with sustainability for sure ” (Participant 03).

Participants described challenges with the type of data collected in audit and feedback activities. The Alberta FICare™ dashboard reported on length of stay, ED visits, and readmissions. Participants described these data as being more useful for administrators and operational leaders than for point-of-care staff. As one participant noted, “ there’s so much that contributes to length of stay as well. You can’t just contribute it to Alberta FICare™ ” (Participant 02).

Integrating Alberta FICare™ into the existing workflow proved to be a barrier, given many sites were implementing it alongside the implementation of an electronic clinical information system (Connect Care). However, there is ongoing work to integrate Alberta FICare™ into the electronic clinical information system, which participants described as a key facilitator to supporting their workflow: “ having a Connect Care line with Alberta FICare™ promotes that Alberta FICare™ to continue and that nurses need to document on it. So that work has been really crucial, I think, in part of building sustainability ” (Participant 02). One example of integration is education points for ‘Parent Participation in Care’ and Bedside Rounds, which allow providers to document parent integration in their infant’s care and bedside rounds.

Participants noted that although clinical champions are a useful KT strategy for sustainability, it can be difficult to engage clinical champions when it is not a dedicated role, with explicitly dedicated resources. As one participant explained “ To make this sustainable you need resources dedicated to it. And to rely on frontline champions…it is difficult at the best times to get them engaged ” (Participant 02).

A significant barrier to adapting components of Alberta FICare™ to fit in the workplace has been the ambiguity of intervention tools and components. Alberta FICare™ was described as a model of care, and participants found it challenging to know what specific components should be adapted to sites to support sustainability. Some participants felt that it should not be up to the individual sites to develop tools to sustain the EBI, and in fact, there should be a more consistent approach to sustainability.

Participants noted that sustaining Alberta FICare™ has minimal financial requirements; however, funds are needed to support the use of KT strategies for sustainability. One participant explained their health centre foundation is a good resource for funding as they are “ good at supporting family initiatives and things that improve the family and patient experience ” (Participant 02).

The peer family member support program was identified as a key intervention component that also supports the sustainability of Alberta FICare™; however, participants described challenges with engaging parents in the ongoing sustainability of Alberta FICare™. There was a lack of time and opportunity to engage parents in providing feedback. As one participant explained, “ parents are just in a state of crisis when they’re in the NICUs. The last thing that they want to do is actually fill out a survey ” (Participant 04).

Phase 3: Alberta FICare™ Lead consultation

Through correspondence with the Alberta FICare™ Leads, we learned of additional methods to enhance the KT strategies used for sustainability. For example, as part of the audit and feedback strategy, the Alberta FICare™ Leads developed an Alberta FICare™ dashboard. The Alberta FICare™ Leads provided additional details on the specific data reported in the dashboard, including length of stay, 7-day readmissions, and 7-day ED visits by site and zone. In relation to the peer family mentor support component of the EBI, COVID-19 delayed full implementation. Upon conclusion of scale and spread research efforts, the MNCY SCN began to assume leadership to support long-term sustainability of Alberta FICare™. A new provincial MNCY role of Family Mentor Clinical Coordinator was developed, aimed at completing implementation and supporting ongoing practice of the Family Mentor component. To accompany the educational strategies, the Leads developed parent- and staff-facing webpages to communicate key details regarding Alberta FICare™ and support ongoing education on the initiative. Lastly, since the environmental scan and key informant interviews were conducted, the Alberta FICare™ team developed a business case to demonstrate the cost benefit of Alberta FICare™ and have since secured 3 years of fixed funding, hiring a provincial Practice Lead to coordinate and continue evaluation, and a Family Mentor Clinical Coordinator to further develop parent support. The Alberta FICare™ Leads presented before two provincial health services committees and secured funding based on the value generated by Alberta FICare™ for the health system and families.

Discussion

This study aimed to examine what and how KT strategies are used to facilitate the sustainability of Alberta FICare™, an EBI that enhances family-centered care practice and empowers parents of infants admitted to the NICU with the knowledge, skills, and confidence to facilitate an earlier discharge home [ 21 , 22 ]. We conducted an environmental scan of relevant documents and key informant interviews with nursing clinical leaders and administrators to identify KT strategies used to sustain Alberta FICare™ and the perceived barriers and facilitators to using those strategies. By integrating the two data sources and seeking clarification and insights from the Alberta FICare™ Project Leads, our findings provide a more comprehensive overview of how KT strategies are used for the sustainability of EBIs. The environmental scan highlighted key KT strategies that were planned from the outset, including online education and clinical nurse champions. The key informant interviews identified additional KT strategies that were used at different sites, although not planned from the outset of the project (i.e., integrating components of Alberta FICare™ into the new electronic clinical information system, promoting adaptability). These insights demonstrate how KT strategies are selected and adapted throughout the sustainability process once an EBI is implemented into real-world practice and integrated into workflow processes. Our findings provide valuable information to support nurses and nurse leaders when selecting KT strategies to implement and sustain EBIs in a variety of clinical settings.

Both the environmental scan and key informant interviews highlighted training and educational strategies as one of the primary KT strategies for supporting sustainability of Alberta FICare™. Environmental scan documents described the use of online, asynchronous education modules for multidisciplinary NICU staff to support the ongoing delivery of Alberta FICare™. Similarly, the key informant interviews described staff education delivered via online learning modules, largely integrated into orientation training for new staff at several sites. The emphasis on educational strategies is not surprising. Our previous systematic review of KT strategies for implementing nursing guidelines identified 36/41 studies that used educational strategies, reporting positive impact on professional practice outcomes, professional knowledge outcomes, patient health status, and resource use outcomes [ 38 ]. Further, our scoping review of KT strategies used for the sustainability of EBIs (including models of care) found 24/25 studies reporting using educational strategies [ 20 ]. Despite educational strategies being the most commonly reported KT strategies, previous research clearly highlights the range of contextual factors influencing sustainability of EBIs, including inadequate staff resourcing and lack of organizational support [ 35 , 39 ], which cannot be addressed by educational strategies alone [ 40 ].

The reported KT strategies were not employed in the same way across all sites represented in this study. For instance, the key informant interviews provided additional details on how educational strategies have been tailored to context-specific barriers and facilitators. Some sites have modified this KT strategy, including integrating educational strategies on Alberta FICare™ into their annual orientation, while others disseminate information in the form of regular emails. While it is important to avoid adaptations to the core EBI components, adapting and tailoring KT strategies to local barriers and facilitators is critical to support ongoing sustainability efforts [ 41 ].

Participants described an ad hoc approach to adaptations of KT strategies that lacked formal guidance. Our findings illustrate the need for clear guidance on if and how KT strategies used for initial implementation can be adapted for use in sustainability. This finding is consistent with previous sustainability studies. Johnson et al. (2019) conducted a qualitative content analysis of implementation studies funded by the United States National Institutes of Health and found that adaptation was not substantively described in the grant proposals [ 42 ]. Further, our scoping review identified a lack of reporting on how KT strategies are adapted from implementation to sustainability [ 20 ]. This lack of clarity on the transition from implementation to sustainability makes it challenging for nursing leaders to select, tailor, and use KT strategies for different types of EBIs. To address this gap, improved reporting is needed to describe how KT strategies have been adapted to the local context, which will help nurse leaders select and tailor KT strategies to support the sustainability of EBIs. Implementation scientists have developed the Framework for Reporting Adaptations and Modifications to EBIs-Implementation Strategies (FRAME-IS), a practical tool for documenting and considering modifications to implementation strategies [ 43 ]. Our findings clearly indicate the need for this type of reporting tool to expand our understanding of how to adapt implementation strategies into sustainability strategies.

This study demonstrated the value of the research co-production approach used by researchers and the health system [ 44 ]. This partnership was critical for the successful design, implementation, evaluation, and spread and scale of Alberta FICare™ across 14 NICUs in Alberta. However, some participants described Alberta FICare™ as primarily a research project, rather than a healthcare practice and policy change. In the environmental scan and key informant interviews, it was unclear who was primarily responsible for the ongoing maintenance of the EBI. Through the Alberta FICare™ Project Lead consultations, we learned that Alberta FICare™ now has three years of fixed funding, with a provincial Practice Lead to coordinate and continue evaluation, and a Family Mentor Clinical Coordinator to further develop parent support.

A key strength of Alberta FICare™ is having ongoing, secure funding to support maintenance and ongoing use in practice. However, it is not always clear in the co-production and sustainability literature who is responsible for EBI sustainability. There is a lack of guidance to support researchers and health system leaders in engaging in co-production past a research study or after grant funding ends [ 42 ]. Our study highlights several important practical questions for sustainability planning. What role do researchers have in the sustainability of EBIs? Is there a distinct handover that has to occur, or how does the health system ‘take over’ responsibility once an EBI has been deemed effective and successfully implemented? Other scholars highlight related considerations for sustainability work. Johnson et al.’s study of how researchers conceptualized and planned for the sustainability of health interventions raised a similar question of who is responsible for sustainability planning; they recommend that sustainability planning be a “dynamic, multifaceted approach with the involvement of all those who have a stake in sustainability such as funders, researchers, practitioners, and program beneficiaries” [ 42 ]. The Alberta FICare™ Project Leads highlight the value of this dynamic, multifaceted approach, which allowed them to work with their funders to secure resources to support sustainability. Further, these findings speak to the need for longitudinal research on the sustainability process. Sustainability of EBIs is more than a single snapshot in time, and ongoing evaluation is needed to understand how it works in practice within research co-production partnerships between researchers, health system leaders, and patients and families.

The science on KT strategies is evolving. For this study, we used the 2015 version of the ERIC Taxonomy to guide our data collection and analysis activities [ 7 ]. Since then, an important sustainability science paper has been published in which researchers adapted, refined, and extended the ERIC compilation to incorporate an explicit focus on sustainment [ 45 ]. Nathan et al. [ 45 ] found that most ERIC strategies required minor changes, whereas four strategies were significantly revised. Most notably, “develop educational materials” was adapted to “review and update educational materials,” which aligns with our findings on the need for ongoing updates to educational materials for Alberta FICare™. Overall, our study complements Nathan et al.’s sustainment-explicit ERIC glossary by describing how these strategies support sustainability, with practical and illustrative examples from Alberta FICare™. Moving forward, efforts are needed to apply this sustainment-explicit ERIC glossary to other EBI projects to further develop our understanding of what and how KT strategies are being used to implement and sustain EBIs.

We identified two conceptual challenges that require further exploration in the implementation and sustainability science literature. First, a challenge with examining the sustainability of an EBI is navigating the difference between EBI implementation and sustainability. This study supports the need to shift our perspective of implementation and sustainability to a continuum instead of distinct entities [ 46 ]. Lennox et al.’s systematic review of sustainability approaches in healthcare revealed two distinct conceptualizations of sustainability: (i) sustainability as a linear process that follows implementation, the end goal to be achieved; and (ii) sustainability as a concurrent process alongside implementation, influenced and adapted over time to impact long-term use of the intervention [ 35 ]. Our study findings highlight the value of a concurrent approach. While Alberta FICare™ was successfully implemented, it is unclear when or how an implementation strategy became a sustainability strategy. Building on the reporting guideline work for implementation researchers, we recommend that researchers also adequately report KT strategies for sustainability, as well as adaptations of KT strategies from implementation to sustainability, to support replication by other researchers, clinicians, and implementation practitioners. Such details include KT strategy dose, frequency, mode of delivery, and adaptations from initial implementation efforts to long-term sustainability efforts.

Second, Moore et al. (2017) cite two foundational challenges in the sustainability literature: (i) the lack of a standard definition and (ii) the variety of synonyms used in the literature. Our study findings highlight an additional challenge with terminology: sustainability often gets conflated with spread and scale, despite distinct differences [ 30 ]. Greenhalgh and Papoutsi define spread as “replicating an initiative somewhere else” and scale as “building infrastructure to support full scale implementation” [ 18 ]. However, sustainability differs from these two processes and focuses more on the extent to which an EBI can deliver its intended benefits over an extended period of time after external support is terminated [ 47 ]. In our environmental scan, documents primarily described the process of moving from the cRCT towards scale and spread of the EBI into all NICUs in the province. This was a critical process to successfully increase the use of Alberta FICare™ across more healthcare institutions. However, documentation lacked detailed information about KT strategies to facilitate sustainability of the EBI once it had been scaled and spread. Similarly, our key informant interviews reiterated the success of scale and spread but described a lack of clarity about which KT strategies to use to support sustainability over time. Future EBI scale and spread initiatives should also consider sustainability planning from the outset. Further, additional research is needed to understand whether sustainability strategies change based on whether the focus of the EBI is on spread or on scale.

Nursing implications

There are specific implications from our study for nursing practice and research. We echo Proctor et al.’s calls for a more intentional sustainability research agenda, including advancing the capacity, culture, and mechanisms for sustainability and advancing methods for sustainability research [ 16 ]. Advancing this agenda within the nursing context is critical given the significant role nurses play in the implementation and sustainability of EBIs in healthcare [ 48 , 49 ]. Implementation capacity building is becoming increasingly common given the importance of assessing barriers and facilitators to practice change to inform implementation planning [ 50 ]. However, these initiatives often focus on individual provider behaviors and the context of EBI implementation. Nursing clinicians need tangible tools to support their sustainability planning as well. Capacity building efforts are needed to support nursing practitioners, leaders, and health system administrators to tackle EBI implementation and sustainability on a continuum and to plan for sustainability from the start of a nursing practice or policy change initiative.

As nursing researchers, it is our role to advance the science of implementation and sustainability and to support nurses and administrators in using evidence-based KT strategies in their implementation and sustainability efforts. To do so, further research is needed to build on the implementation science body of knowledge and consider sustainability-specific strategies, or how to adapt implementation strategies into sustainability strategies that support the maintenance of EBIs in nursing practice and policy. We recommend building on existing sustainability frameworks, such as the CFS and the Dynamic Sustainability Framework, to support reporting and testing initiatives of KT strategies for sustainability. Lastly, nursing researchers must work in a research co-production approach to successfully enable sustainability. As our findings indicate, the research partnership between the University of Calgary and the AHS MNCY SCN allowed for rigorous research, scale and spread, and the establishment of secured funding to support ongoing sustainability. The cRCT and process evaluation of Alberta FICare™ provided the evidence to scale and spread the EBI across the province. These were critical steps in advancing the sustainability of the EBI. Oftentimes, sustainability is thought about retrospectively: an EBI is implemented, and now we want to sustain it. We urge researchers, nursing leaders, and health system administrators to work together on prospective sustainability research and pragmatic planning.

Strengths and limitations

Our study findings should be considered with the following limitations in mind. This study was conducted in partnership with health system knowledge users; however, we did not have patient and public involvement in our study. Having patient and public partners would have added insights into the relevancy and utility of the KT strategies identified. The study sample for the qualitative interview phase may have missed some important perspectives, as we did not interview a key informant from each NICU that implemented Alberta FICare™. As such, we may have missed KT strategies that are being used to facilitate sustainability in different contexts. Further, we did not interview point-of-care nurses to explore how they are using the EBI in their daily practice. Despite these limitations, we supplemented interviews with the environmental scan document analysis and the Alberta FICare™ Project Leads consultation, which allowed for a broader understanding of what and how KT strategies are used to facilitate the sustainability of Alberta FICare™. Further, we used several implementation and sustainability frameworks to map findings onto the existing literature on KT strategies.

Conclusion

This multimethod qualitative study explored how KT strategies are used to facilitate the sustainability of an EBI. Using Alberta FICare™ as a case example, we identified a range of KT strategies used for sustainability, including online education, clinical nurse champions, and academic-health system co-production. Our findings illustrate how KT strategies are adapted over the sustainability process once an EBI is implemented into real-world nursing practice. Adaptation of interventions must be considered from the onset of implementation so that interventions can be tailored to align with contextual barriers to sustainability. Further, this research highlights the importance of clearly articulating who is responsible for continued championing for the sustainability of EBIs. Clear guidance is needed to continually support researchers and nurse leaders in co-producing strategies that facilitate the long-term sustainability of effective EBIs in nursing practice and policy.

Data availability

All data generated or analysed during this study are included in this published article.

Abbreviations

AHS: Alberta Health Services

AIMD: Aims, ingredients, mechanism, delivery

CFS: Consolidated Framework for Sustainability

cRCT: Clustered randomized control trial

EBI: Evidence-based intervention

ED: Emergency Department

ERIC: Expert Recommendations for Implementing Change

FICare: Family Integrated Care

FRAME-IS: Framework for Reporting Adaptations and Modifications to EBIs-Implementation Strategies

KT: Knowledge Translation

LOS: Length of stay

MNCY: Maternal, Newborn, Child & Youth

NICU: Neonatal Intensive Care Unit

SCN: Strategic Clinical Network

References

Mackey A, Bassendowski S. The history of evidence-based practice in nursing education and practice. J Prof Nurs. 2017;33(1):51–5.

World Health Organization. Nursing and midwifery. World Health Organization. 2022. Available from: https://www.who.int/news-room/fact-sheets/detail/nursing-and-midwifery .

Brownson R, Colditz G, Proctor E. Dissemination and Implementation Research in Health. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 1st ed. Oxford University Press; 2017 [cited 2023 Jul 18]. p. 1–520. Available from: https://global.oup.com/academic/product/dissemination-and-implementation-research-in-health-9780197660690 .

Fischer F, Lange K, Klose K, Greiner W, Kraemer A. Barriers and strategies in Guideline Implementation-A scoping review. Healthc Basel Switz. 2016;4(3):36.

Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50.

Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci IS. 2013;8:139.

Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert recommendations for Implementing Change (ERIC) study. Implement Sci IS. 2015;10:109.

Yamada J, Shorkey A, Barwick M, Widger K, Stevens BJ. The effectiveness of toolkits as knowledge translation strategies for integrating evidence into clinical care: a systematic review. BMJ Open. 2015;5(4):e006808.

Abdullah G, Rossy D, Ploeg J, Davies B, Higuchi K, Sikora L, et al. Measuring the effectiveness of mentoring as a knowledge translation intervention for implementing empirical evidence: a systematic review. Worldviews Evid Based Nurs. 2014;11(5):284–300.

Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259.

Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement Sci IS. 2014;9:152.

Thompson DS, Estabrooks CA, Scott-Findlay S, Moore K, Wallin L. Interventions aimed at increasing research use in nursing: a systematic review. Implement Sci. 2007;2(1):15.

Thompson C, Stapley S. Do educational interventions improve nurses’ clinical decision making and judgement? A systematic review. Int J Nurs Stud. 2011;48(7):881–93.

Yost J, Ganann R, Thompson D, Aloweni F, Newman K, Hazzan A, et al. The effectiveness of knowledge translation interventions for promoting evidence-informed decision-making among nurses in tertiary care: a systematic review and meta-analysis. Implement Sci IS. 2015;10:98.

LaRocca R, Yost J, Dobbins M, Ciliska D, Butt M. The effectiveness of knowledge translation strategies used in public health: a systematic review. BMC Public Health. 2012;12(1):751.

Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10(1):88.

Braithwaite J, Ludlow K, Testa L, Herkes J, Augustsson H, Lamprell G, et al. Built to last? The sustainability of healthcare system improvements, programmes and interventions: a systematic integrative review. BMJ Open. 2020;10(6):e036453.

Greenhalgh T, Papoutsi C. Spreading and scaling up innovation and improvement. BMJ. 2019;365:l2068.

Hailemariam M, Bustos T, Montgomery B, Barajas R, Evans LB, Drahota A. Evidence-based intervention sustainability strategies: a systematic review. Implement Sci. 2019;14(1):57.

Flynn R, Cassidy C, Dobson L, Al-Rassi J, Langley J, Swindle J, et al. Knowledge translation strategies to support the sustainability of evidence-based interventions in healthcare: a scoping review. Implement Sci. 2023;18(1):69.

Moe AM, Kurilova J, Afzal AR, Benzies KM. Effects of Alberta Family Integrated Care (FICare) on Preterm Infant Development: two studies at 2 months and between 6 and 24 months corrected age. J Clin Med. 2022;11(6):1684.

Zanoni P, Scime NV, Benzies K, McNeil DA, Mrklas K. Facilitators and barriers to implementation of Alberta family integrated care (FICare) in level II neonatal intensive care units: a qualitative process evaluation substudy of a multicentre cluster-randomised controlled trial using the consolidated framework for implementation research. BMJ Open. 2021;11(10):e054938.

Benzies KM, Aziz K, Shah V, Faris P, Isaranuwatchai W, Scotland J, et al. Effectiveness of Alberta Family Integrated Care on infant length of stay in level II neonatal intensive care units: a cluster randomized controlled trial. BMC Pediatr. 2020;20(1):535.

O’Brien K, Bracht M, Macdonell K, McBride T, Robson K, O’Leary L, et al. A pilot cohort analytic study of Family Integrated Care in a Canadian neonatal intensive care unit. BMC Pregnancy Childbirth. 2013;13(Suppl 1):12.

O’Brien K, Robson K, Bracht M, Cruz M, Lui K, Alvaro R, et al. Effectiveness of Family Integrated Care in neonatal intensive care units on infant and parent outcomes: a multicentre, multinational, cluster-randomised controlled trial. Lancet Child Adolesc Health. 2018;2(4):245–54.

Dien R, Benzies KM, Zanoni P, Kurilova J. Alberta Family Integrated Care™ and standard care: a qualitative study of mothers’ experiences of their journeying to home from the neonatal intensive care unit. Glob Qual Nurs Res. 2022;9:23333936221097113.

Shafey A, Benzies K, Amin R, Stelfox HT, Shah V. Fathers’ experiences in Alberta Family Integrated Care: a qualitative study. J Perinat Neonatal Nurs. 2022;36(4):371–9.

Murphy M, Shah V, Benzies K. Effectiveness of Alberta Family-Integrated Care on neonatal outcomes: a Cluster Randomized Controlled Trial. J Clin Med. 2021;10(24):5871.

Wasylak T, Benzies K, McNeil D, Zanoni P, Osiowy K, Mullie T, et al. Creating Value through Learning Health systems: the Alberta Strategic Clinical Network Experience. Nurs Adm Q. 2023;47(1):20–30.

Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):1–8.

Graham P, Evitts T, Thomas-MacLean R. Environmental scans: how useful are they for primary care research? Can Fam Physician Med Fam Can. 2008;54(7):1022–3.

The AIMD Writing/Working Group, Bragge P, Grimshaw JM, Lokker C, Colquhoun H. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17(1):38.

Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

Doyle L, McCabe C, Keogh B, Brady A, McCann M. An overview of the qualitative descriptive design within nursing research. J Res Nurs. 2020;25(5):443–55.

Lennox L, Maher L, Reed J. Navigating the sustainability landscape: a systematic review of sustainability approaches in healthcare. Implement Sci. 2018;13(1):1–17.

Lumivero Ltd. QIP. NVivo (Version 12) - Lumivero. lumivero.com; 2018 [cited 2023 Mar 23]. Available from: https://lumivero.com/products/nvivo/ .

Home.| Covenant Health. [cited 2024 Jan 17]. Available from: https://www.covenanthealth.ca/ .

Cassidy CE, Harrison MB, Godfrey C, Nincic V, Khan PA, Oakley P, et al. Use and effects of implementation strategies for practice guidelines in nursing: a systematic review. Implement Sci. 2021;16(1):102.

Cowie J, Nicoll A, Dimova ED, Campbell P, Duncan EA. The barriers and facilitators influencing the sustainability of hospital-based interventions: a systematic review. BMC Health Serv Res. 2020;20(1):588.

Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):1–15.

Metz A, Kainz K, Boaz A. Intervening for sustainable change: Tailoring strategies to align with values and principles of communities. Front Health Serv. 2023 [cited 2023 Jul 13];2. Available from: https://www.frontiersin.org/articles/ https://doi.org/10.3389/frhs.2022.959386 .

Johnson AM, Moore JE, Chambers DA, Rup J, Dinyarian C, Straus SE. How do researchers conceptualize and plan for the sustainability of their NIH R01 implementation projects? Implement Sci. 2019;14(1):1–9.

Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16(1):36.

Graham ID, McCutcheon C, Kothari A. Exploring the frontiers of research co-production: the Integrated Knowledge Translation Research Network concept papers. Health Res Policy Syst. 2019;17(1):88. s12961-019-0501–7.

Nathan N, Powell BJ, Shelton RC, Laur CV, Wolfenden L, Hailemariam M et al. Do the Expert Recommendations for Implementing Change (ERIC) strategies adequately address sustainment? Front Health Serv. 2022 [cited 2024 Jan 17];2. Available from: https://www.frontiersin.org/articles/ https://doi.org/10.3389/frhs.2022.905909 .

Nadalin Penno L, Davies B, Graham ID, Backman C, MacDonald I, Bain J, et al. Identifying relevant concepts and factors for the sustainability of evidence-based practices within acute care contexts: a systematic review and theory analysis of selected sustainability frameworks. Implement Sci. 2019;14(1):108.

Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and Implementation Research in Health. J Public Health Manag Pract. 2008;14(2):117.

JONA: Journal of Nursing Administration. May 2021 Vol.51 Issue 5| NursingCenter. 2021 [cited 2023 Jul 23]. Available from: https://www.nursingcenter.com/journalissue?Journal_ID=54024&Issue_ID=5846553

Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. Nursing unit leaders’ influence on the long-term sustainability of evidence-based practice improvements. J Nurs Manag. 2016;24(3):309–18.

Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15(1):1–26.

Download references

Acknowledgements

This study was funded by the University of Alberta Faculty of Nursing Endowment Fund for the Future: Support for the Advancement of Scholarship Research Fund.

Author information

Authors and Affiliations

School of Nursing, Faculty of Health, Dalhousie University, 5869 University Avenue, B3H 4R2, Halifax, NS, PO Box 15000, Canada

Christine E. Cassidy

School of Nursing and Midwifery, Brookfield Health Sciences Complex, University College of Cork, College Road, T12 AK54, Cork, Ireland

Rachel Flynn

Faculty of Nursing, University of Prince Edward Island, 550 University Avenue, HSB Room 116, C1A 4P3, Charlottetown, PE, Canada

Alyson Campbell

Faculty of Nursing, Edmonton Clinic Health Academy, University of Alberta, Level 3, 11405 87 Avenue, T6G 1C9, Edmonton, AB, Canada

Lauren Dobson & Ella Milne

Faculty of Health, Dalhousie University, 5790 University Avenue, B3H 1V7, Halifax, NS, Canada

Jodi Langley

Faculty of Nursing, University of Calgary, 2500 University Drive NW, T2N 1N4, Calgary, AB, Canada

Pilar Zanoni

Strategic Clinical Networks, Alberta Health Services, 10101 Southport Road SW, T2W 3N2, Calgary, AB, Canada

Deborah McNeil

Faculty of Nursing, Departments of Pediatrics and Community Health Science, Cumming School of Medicine, University of Calgary, 2500 University Drive NW, T2N 1N4, Calgary, AB, Canada

Deborah McNeil & Karen M. Benzies

Department of Pediatrics, IWK Health, 5980 University Ave #5850, B3K 6R8, Halifax, NS, Canada

Megan Churchill


Contributions

CC and RF developed the research question. CC, RF, and AC designed the study. LD, JL, EM, and MC collected data and drafted components of the manuscript while supervised by CC and RF. DM, PZ, and KB supported data collection and data analysis procedures, and advised on data interpretation. All authors collaborated on writing the manuscript, and all approved the final version.

Corresponding author

Correspondence to Christine E. Cassidy.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was granted by the University of Alberta Health Research Ethics Board (CHREB #Pro00116834) and the Covenant Health Research Centre. Informed consent was obtained from all participants involved in this study.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Cassidy, C.E., Flynn, R., Campbell, A. et al. Knowledge translation strategies used for sustainability of an evidence-based intervention in child health: a multimethod qualitative study. BMC Nurs 23 , 125 (2024). https://doi.org/10.1186/s12912-024-01777-4


Received: 23 October 2023

Accepted: 30 January 2024

Published: 17 February 2024

DOI: https://doi.org/10.1186/s12912-024-01777-4


Keywords

  • Sustainability
  • Implementation science
  • Knowledge translation

BMC Nursing

ISSN: 1472-6955

