- Open access
- Published: 27 November 2020
Designing process evaluations using case study to explore the context of complex interventions evaluated in trials
- Aileen Grant 1 ,
- Carol Bugge 2 &
- Mary Wells 3
Trials volume 21 , Article number: 982 ( 2020 ) Cite this article
Process evaluations are an important component of an effectiveness evaluation as they focus on understanding the relationship between interventions and context to explain how and why interventions work or fail, and whether they can be transferred to other settings and populations. However, historically, context has not been sufficiently explored or reported, resulting in poor uptake of trial results. Therefore, suitable methodologies are needed to guide the investigation of context. Case study is one appropriate methodology, but there is little guidance about what case study design can offer the study of context in trials. We address this gap in the literature by presenting a number of important considerations for process evaluation using a case study design.
In this paper, we define context, the relationship between complex interventions and context, and describe case study design methodology. A well-designed process evaluation using case study should consider the following core components: the purpose; the definition of the intervention; the trial design; the case; the theories or logic models underpinning the intervention; the sampling approach; and the conceptual or theoretical framework. We describe each of these in detail and illustrate them with examples from recently published process evaluations.
There are a number of approaches to process evaluation design in the literature; however, there is a paucity of research on what case study design can offer process evaluations. We argue that case study is one of the best research designs to underpin process evaluations, to capture the dynamic and complex relationship between intervention and context during implementation. We provide a comprehensive overview of the issues for process evaluation design to consider when using a case study design.
Trial registration: DQIP, ClinicalTrials.gov NCT01425502; OPAL, ISRCTN57746448.
Contribution to the literature
We illustrate how case study methodology can explore the complex, dynamic and uncertain relationship between context and interventions within trials.
We depict different case study designs and illustrate there is not one formula and that design needs to be tailored to the context and trial design.
Case study can support comparisons between intervention and control arms and between cases within arms to uncover and explain differences in detail.
We argue that case study can illustrate how components have evolved and been redefined through implementation.
Key issues for consideration in case study design within process evaluations are presented and illustrated with examples.
Process evaluations are an important component of an effectiveness evaluation as they focus on understanding the relationship between interventions and context to explain how and why interventions work or fail and whether they can be transferred to other settings and populations. However, historically, not all trials have had a process evaluation component, nor have they sufficiently reported aspects of context, resulting in poor uptake of trial findings [ 1 ]. Considerations of context are often absent from published process evaluations, with few studies acknowledging, taking account of or describing context during implementation, or assessing the impact of context on implementation [ 2 , 3 ]. At present, evidence from trials is not being used in a timely manner [ 4 , 5 ], and this can negatively impact on patient benefit and experience [ 6 ]. It takes on average 17 years for knowledge from research to be implemented into practice [ 7 ]. Suitable methodologies are therefore needed that allow for context to be exposed; one appropriate methodological approach is case study [ 8 , 9 ].
In 2015, the Medical Research Council (MRC) published guidance for process evaluations [ 10 ]. This was a key milestone in legitimising as well as providing tools, methods and a framework for conducting process evaluations. Nevertheless, as with all guidance, there is a need for reflection, challenge and refinement. There have been a number of critiques of the MRC guidance, including that interventions should be considered as events in systems [ 11 , 12 , 13 , 14 ]; a need for better use, critique and development of theories [ 15 , 16 , 17 ]; and a need for more guidance on integrating qualitative and quantitative data [ 18 , 19 ]. Although the MRC process evaluation guidance does consider appropriate qualitative and quantitative methods, it does not mention case study design and what it can offer the study of context in trials.
Case study methodology is ideally suited to real-world, sustainable intervention development and evaluation because it can explore and examine contemporary complex phenomena, in depth, in numerous contexts and using multiple sources of data [ 8 ]. Case study design can capture the complexity of the case, the relationship between the intervention and the context, and how the intervention worked (or not) [ 8 ]. There are a number of case study textbooks within the social science fields [ 8 , 9 , 20 ], but there are no case study textbooks, and a paucity of useful texts, on how to design, conduct and report case study within the health arena. Few examples exist within the trial design and evaluation literature [ 3 , 21 ]. Therefore, guidance to enable well-designed process evaluations using case study methodology is required.
We aim to address this gap in the literature by presenting a number of important considerations for process evaluation using a case study design. First, we define context and describe the relationship between complex health interventions and context.
What is context?
While there is growing recognition that context interacts with the intervention to impact on the intervention’s effectiveness [ 22 ], context is still poorly defined and conceptualised. There are a number of different definitions in the literature, but as Bate et al. explained, ‘almost universally, we find context to be an overworked word in everyday dialogue but a massively understudied and misunderstood concept’ [ 23 ]. Ovretveit defines context as ‘everything the intervention is not’ [ 24 ]. This latter definition is used by the MRC framework for process evaluations [ 25 ]; however, the problem with this definition is that it is highly dependent on how the intervention is defined. We have found Pfadenhauer et al.’s definition useful:
Context is conceptualised as a set of characteristics and circumstances that consist of active and unique factors that surround the implementation. As such it is not a backdrop for implementation but interacts, influences, modifies and facilitates or constrains the intervention and its implementation. Context is usually considered in relation to an intervention or object, with which it actively interacts. A boundary between the concepts of context and setting is discernible: setting refers to the physical, specific location in which the intervention is put into practice. Context is much more versatile, embracing not only the setting but also roles, interactions and relationships [ 22 ].
Traditionally, context has been conceptualised in terms of barriers and facilitators, but what is a barrier in one context may be a facilitator in another, so it is the relationship and dynamics between the intervention and context which are most important [ 26 ]. There is a need for empirical research to really understand how different contextual factors relate to each other and to the intervention. At present, research studies often list common contextual factors, such as government or health board policies, organisational structures, and professional and patient attitudes, behaviours and beliefs, but without a depth of meaning and understanding [ 27 ]. Case study methodology is well placed to understand the relationship between context and intervention where these boundaries may not be clearly evident. It offers a means of unpicking the contextual conditions which are pertinent to effective implementation.
The relationship between complex health interventions and context
Health interventions are generally made up of a number of different components and are considered complex due to the influence of context on their implementation and outcomes [ 3 , 28 ]. Complex interventions are often reliant on the engagement of practitioners and patients, so their attitudes, behaviours, beliefs and cultures influence whether and how an intervention is effective. Interventions are context-sensitive: they interact with the environment in which they are implemented. In fact, many argue that interventions are a product of their context, and indeed that outcomes are likely to be a product of the intervention and its context [ 3 , 29 ]. Within a trial, there is also the influence of the research context, so the observed outcome could be due to the intervention alone, elements of the context within which the intervention is being delivered, elements of the research process or a combination of all three. Therefore, it can be difficult and unhelpful to separate the intervention from the context within which it was evaluated, because the intervention and context are likely to have evolved together over time. As a result, the same intervention can look and behave differently in different contexts, so it is important that this is known, understood and reported [ 3 ]. Finally, the intervention context is dynamic: the people, organisations and systems change over time [ 3 ], which requires practitioners and patients to respond, and they may do this by adapting the intervention or contextual factors. So, to enable researchers to replicate successful interventions, or to explain why an intervention was not successful, it is not enough to describe the components of the intervention; they need to be described in relation to their context and resources [ 3 , 28 ].
What is a case study?
Case study methodology aims to provide an in-depth, holistic, balanced, detailed and complete picture of complex contemporary phenomena in their natural context [ 8 , 9 , 20 ]. In this case, the phenomenon is the implementation of a complex intervention in a trial. Case study methodology takes the view that phenomena can be more than the sum of their parts and have to be understood as a whole [ 30 ]. It is differentiated from a clinical case study by its analytical focus [ 20 ].
The methodology is particularly useful when linked to trials because some of the features of the design naturally fill the gaps in knowledge generated by trials. Given the methodological focus on understanding phenomena in the round, case study methodology is typified by the use of multiple sources of data, which are more commonly qualitatively guided [ 31 ]. The case study methodology is not epistemologically specific, like realist evaluation, and can be used with different epistemologies [ 32 ], and with different theories, such as Normalisation Process Theory (which explores how staff work together to implement a new intervention) or the Consolidated Framework for Implementation Research (which provides a menu of constructs associated with effective implementation) [ 33 , 34 , 35 ]. Realist evaluation can be used to explore the relationship between context, mechanism and outcome, but case study differs from realist evaluation by its focus on a holistic and in-depth understanding of the relationship between an intervention and the contemporary context in which it was implemented [ 36 ]. Case study enables researchers to choose epistemologies and theories which suit the nature of the enquiry and their theoretical preferences.
Designing a process evaluation using case study
An important part of any study is the research design. Due to their varied philosophical positions, the seminal authors in the field of case study have different epistemic views as to how a case study should be conducted [ 8 , 9 ]. Stake takes an interpretative approach (interested in how people make sense of their world), and Yin has more positivistic leanings, arguing for objectivity, validity and generalisability [ 8 , 9 ].
Regardless of the philosophical background, a well-designed process evaluation using case study should consider the following core components: the purpose; the definition of the intervention, the trial design, the case, and the theories or logic models underpinning the intervention; the sampling approach; and the conceptual or theoretical framework [ 8 , 9 , 20 , 31 , 33 ]. We now discuss these critical components in turn, with reference to two process evaluations that used case study design, the DQIP and OPAL studies [ 21 , 37 , 38 , 39 , 40 , 41 ].
The purpose of a process evaluation is to evaluate and explain the relationship between the intervention and its components, the context and the outcomes. It can help inform judgements about validity, by exploring the intervention components and their relationship with one another (construct validity), the connections between intervention and outcomes (internal validity) and the relationship between intervention and context (external validity). It can also distinguish between implementation failure (where the intervention is poorly delivered) and intervention failure (where the intervention design is flawed) [ 42 , 43 ]. By using a case study to explicitly understand the relationship between context and the intervention during implementation, the process evaluation can explain the intervention effects and the potential for generalisability and optimisation into routine practice [ 44 ].
The DQIP process evaluation aimed to qualitatively explore how patients and GP practices responded to an intervention designed to reduce high-risk prescribing of nonsteroidal anti-inflammatory drugs (NSAIDs) and/or antiplatelet agents (see Table 1 ) and quantitatively examine how change in high-risk prescribing was associated with practice characteristics and implementation processes. The OPAL process evaluation (see Table 2 ) aimed to quantitatively understand the factors which influenced the effectiveness of a pelvic floor muscle training intervention for women with urinary incontinence and qualitatively explore the participants’ experiences of treatment and adherence.
Defining the intervention and exploring the theories or assumptions underpinning the intervention design
Process evaluations should also explore the utility of the theories or assumptions underpinning intervention design [ 49 ]. Not all interventions are based on a formal theory, but all are based on assumptions as to how the intervention is expected to work. These can be depicted as a logic model or theory of change [ 25 ]. Capturing how the intervention and context evolve requires the intervention and its expected mechanisms to be clearly defined at the outset [ 50 ]. Hawe and colleagues recommend defining interventions by function (what processes make the intervention work) rather than form (what is delivered) [ 51 ]. However, in some cases, it may be useful to know whether some of the components are redundant in certain contexts or whether there is a synergistic effect between all the intervention components.
The DQIP trial delivered two interventions: one was delivered to professionals with high fidelity, and professionals then delivered the other intervention to patients by function rather than form, allowing adaptations to the local context as appropriate. The assumptions underpinning intervention delivery were prespecified in a logic model published in the process evaluation protocol [ 52 ].
Case study is well placed to challenge or reinforce the theoretical assumptions, or to redefine these based on the relationship between the intervention and context. Yin advocates the use of theoretical propositions; these direct attention to specific aspects of the study for investigation [ 8 ], can be based on the underlying assumptions and can be tested during the course of the process evaluation. In case studies, using an epistemic position more aligned with Yin can enable research questions to be designed which seek to expose patterns of unanticipated as well as expected relationships [ 9 ]. The OPAL trial was more closely aligned with Yin: the research team predefined some of their theoretical assumptions, based on how the intervention was expected to work. The relevant parts of the data analysis then drew on data to support or refute the theoretical propositions. This was particularly useful for the trial, as the prespecified theoretical propositions linked to the mechanisms of action through which the intervention was anticipated to have an effect (or not).
Tailoring to the trial design
Process evaluations need to be tailored to the trial, the intervention and the outcomes being measured [ 45 ]. For example, in a stepped wedge design (where the intervention is delivered in a phased manner), researchers should try to ensure process data are captured at relevant time points; in a two-arm or multiple-arm trial, they should ensure data are collected from the control group(s) as well as the intervention group(s). In the DQIP trial, a stepped wedge trial, at least one process evaluation case was sampled per cohort. Trials often continue to measure outcomes after delivery of the intervention has ceased, so researchers should also consider capturing ‘follow-up’ data on contextual factors, which may continue to influence the outcome measure. The OPAL trial had two active treatment arms, so process data were collected from both arms. In addition, as the trial was interested in long-term adherence, the trial and the process evaluation collected data from participants for 2 years after the intervention was initially delivered, providing 24 months of follow-up data in line with the primary outcome for the trial.
Defining the case
Case studies can include single or multiple cases in their design. Single case studies usually sample typical or unique cases, their advantage being the depth and richness that can be achieved over a long period of time. The advantage of multiple case study design is that cases can be compared to generate a greater depth of analysis. Multiple case study sampling may be carried out in order to test for replication or contradiction [ 8 ]. Given that trials are often conducted over a number of sites, a multiple case study design is more sensible for process evaluations, as there is likely to be variation in implementation between sites. Case definition may occur at a variety of levels but is most appropriate if it reflects the trial design. For example, a case in an individual patient level trial is likely to be defined as a person/patient (e.g. a woman with urinary incontinence—OPAL trial), whereas in a cluster trial, a case is likely to be a cluster, such as an organisation (e.g. a general practice—DQIP trial). Of course, the process evaluation could explore cases with less distinct boundaries, such as communities or relationships; however, the clarity with which these cases are defined is important, in order to scope the nature of the data that will be generated.
Carefully sampled cases are critical to a good case study, as sampling informs the quality of the inferences that can be made from the data [ 53 ]. In both qualitative and quantitative research, how participants are to be sampled, and how many, must be decided when planning the study. Quantitative sampling techniques generally aim to achieve a random sample. Qualitative research generally uses purposive samples to achieve data saturation, which occurs when the incoming data produce little or no new information to address the research questions. The term data saturation has evolved from theoretical saturation in conventional grounded theory studies; however, its relevance to other types of studies is contentious, as the term saturation seems to be widely used but poorly justified [ 54 ]. Empirical evidence suggests that for in-depth interview studies, thematic saturation occurs at around 12 interviews, but typically more would be needed for a heterogeneous sample or for higher degrees of saturation [ 55 , 56 ]. Both the DQIP and OPAL case studies were large: OPAL was designed to interview each of the 40 individual cases four times, and DQIP to interview the lead DQIP general practitioner (GP) twice (to capture change over time), another GP and the practice manager from each of the 10 organisational cases. Despite the plethora of mixed methods research textbooks, there is very little about sampling, as discussions typically link to method (e.g. interviews) rather than paradigm (e.g. case study).
Purposive sampling can improve the generalisability of the process evaluation by sampling for greater contextual diversity. The typical or average case is often not the richest source of information. Outliers can often reveal more important insights, because they may reflect the implementation of the intervention using different processes. Cases can be selected from a number of criteria, which are not mutually exclusive, to enable a rich and detailed picture to be built across sites [ 53 ]. To avoid the Hawthorne effect, it is recommended that process evaluations sample from both intervention and control sites, which enables comparison and explanation. There is always a trade-off between breadth and depth in sampling, so it is important to note that often quantity does not mean quality and that carefully sampled cases can provide powerful illustrative examples of how the intervention worked in practice, the relationship between the intervention and context and how and why they evolved together. The qualitative components of both DQIP and OPAL process evaluations aimed for maximum variation sampling. Please see Table 1 for further information on how DQIP’s sampling frame was important for providing contextual information on processes influencing effective implementation of the intervention.
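The logic of maximum variation sampling described above can be sketched in code. The greedy selection below, using hypothetical practice attributes (not the actual DQIP or OPAL sampling frames), simply picks each next case to differ as much as possible from those already chosen; it is an illustration of the principle, not a procedure used by either study.

```python
# Illustrative sketch of maximum variation (purposive) sampling.
# The candidate sites and their contextual attributes are hypothetical.

def diversity(a, b):
    """Count the contextual attributes on which two cases differ."""
    return sum(a[k] != b[k] for k in a)

def max_variation_sample(candidates, n):
    """Greedily select n cases, each chosen to differ most from those already selected."""
    chosen = [candidates[0]]  # seed with the first candidate (arbitrary choice)
    remaining = candidates[1:]
    while len(chosen) < n and remaining:
        # Pick the candidate whose minimum difference from the chosen set is largest,
        # i.e. the case least like anything already in the sample.
        best = max(remaining, key=lambda c: min(diversity(c, s) for s in chosen))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Hypothetical general practices described by three contextual attributes
sites = [
    {"size": "large", "setting": "urban", "list_turnover": "high"},
    {"size": "large", "setting": "urban", "list_turnover": "low"},
    {"size": "small", "setting": "rural", "list_turnover": "low"},
    {"size": "medium", "setting": "mixed", "list_turnover": "high"},
]

sample = max_variation_sample(sites, 2)
# Selects the first site plus the rural, small-practice outlier,
# which differs from it on all three attributes.
```

In practice, of course, sampling frames are constructed from substantive knowledge of which contextual factors are likely to matter, not from an algorithm; the sketch only makes concrete the idea that outliers, rather than typical cases, maximise contextual diversity.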
Conceptual and theoretical framework
A conceptual or theoretical framework helps to frame data collection and analysis [ 57 ]. Theories can also underpin propositions, which can be tested in the process evaluation. Process evaluations produce intervention-dependent knowledge, and theories help make the research findings more generalisable by providing a common language [ 16 ]. There are a number of mid-range theories which have been designed to be used with process evaluation [ 34 , 35 , 58 ]. The choice of the appropriate conceptual or theoretical framework is, however, dependent on the philosophical and professional background of the researchers. The two examples within this paper used our own framework for the design of process evaluations, which proposes a number of candidate processes which can be explored, for example, recruitment, delivery, response, maintenance and context [ 45 ]. This framework was published before the MRC guidance on process evaluations, and both the DQIP and OPAL process evaluations were designed before the MRC guidance was published. The DQIP process evaluation explored all candidates in the framework, whereas the OPAL process evaluation selected four candidates, illustrating that process evaluations can be selective in what they explore based on the purpose, research questions and resources. Furthermore, as Kislov and colleagues argue, we also have a responsibility to critique the theoretical framework underpinning the evaluation and refine theories to advance knowledge [ 59 ].
An important consideration is what data to collect or measure and when. Case study methodology supports a range of data collection methods, both qualitative and quantitative, to best answer the research questions. As the aim of the case study is to gain an in-depth understanding of phenomena in context, methods are more commonly qualitative or mixed method in nature. Qualitative methods such as interviews, focus groups and observation offer rich descriptions of the setting, delivery of the intervention in each site and arm, how the intervention was perceived by the professionals delivering the intervention and the patients receiving the intervention. Quantitative methods can measure recruitment, fidelity and dose and establish which characteristics are associated with adoption, delivery and effectiveness. To ensure an understanding of the complexity of the relationship between the intervention and context, the case study should rely on multiple sources of data and triangulate these to confirm and corroborate the findings [ 8 ]. Process evaluations might consider using routine data collected in the trial across all sites and additional qualitative data across carefully sampled sites for a more nuanced picture within reasonable resource constraints. Mixed methods allow researchers to ask more complex questions and collect richer data than can be collected by one method alone [ 60 ]. The use of multiple sources of data allows data triangulation, which increases a study’s internal validity but also provides a more in-depth and holistic depiction of the case [ 20 ]. For example, in the DQIP process evaluation, the quantitative component used routinely collected data from all sites participating in the trial and purposively sampled cases for a more in-depth qualitative exploration [ 21 , 38 , 39 ].
The timing of data collection is crucial to study design, especially within a process evaluation, where data collection can potentially influence the trial outcome. Process evaluations generally run in parallel with, or retrospective to, the trial. The advantage of a retrospective design is that the evaluation itself is less likely to influence the trial outcome. However, the disadvantages include recall bias, lack of sensitivity to nuances and an inability to iteratively explore the relationship between intervention and outcome as it develops. To capture the dynamic relationship between intervention and context, the process evaluation needs to be parallel and longitudinal to the trial. Longitudinal methodological design is rare, but it is needed to capture the dynamic nature of implementation [ 40 ]. How the intervention is delivered is likely to change over time as it interacts with context; for example, as professionals deliver the intervention, they become more familiar with it, and it becomes more embedded into systems. The OPAL process evaluation was a longitudinal, mixed methods process evaluation where the quantitative component had been predefined and built into trial data collection systems. Data collection in both the qualitative and quantitative components mirrored the trial data collection points, which were longitudinal to capture adherence and contextual changes over time.
There is a lot of attention in the recent literature towards a systems approach to understanding interventions in context, which suggests interventions are ‘events within systems’ [ 61 , 62 ]. This framing highlights the dynamic nature of context, suggesting that interventions are an attempt to change systems dynamics. This conceptualisation would suggest that the study design should collect contextual data before and after implementation to assess the effect of the intervention on the context and vice versa.
Designing a rigorous analysis plan is particularly important for multiple case studies, where researchers must decide whether their approach to analysis is case or variable based. Case-based analysis is the most common, and analytic strategies must be clearly articulated for within- and across-case analysis. A multiple case study design can consist of multiple cases, where each case is analysed at the case level, or of multiple embedded cases, where data from all the cases are pulled together for analysis at some level. For example, OPAL analysis was at the case level, but all the cases for the intervention and control arms were pulled together at the arm level for more in-depth analysis and comparison. For Yin, analytical strategies rely on theoretical propositions, but for Stake, analysis works from the data to develop theory. In OPAL and DQIP, case summaries were written to summarise the cases and detail within-case analysis. Each of the studies structured these differently based on the phenomena of interest and the analytic technique: DQIP applied an approach more akin to Stake [ 9 ], with the cases summarised around inductive themes, whereas OPAL applied a Yin [ 8 ] type approach using theoretical propositions around which the case summaries were structured. As the data for each case had been collected through longitudinal interviews, the case summaries were able to capture changes over time. It is beyond the scope of this paper to discuss different analytic techniques; however, to ensure the holistic examination of the intervention(s) in context, it is important to clearly articulate and demonstrate how data are integrated and synthesised [ 31 ].
There are a number of approaches to process evaluation design in the literature; however, there is a paucity of research on what case study design can offer process evaluations. We argue that case study is one of the best research designs to underpin process evaluations, to capture the dynamic and complex relationship between intervention and context during implementation [ 38 ]. Case study can enable comparisons within and across intervention and control arms and enable the evolving relationship between intervention and context to be captured holistically rather than considering processes in isolation. Utilising a longitudinal design can enable the dynamic relationship between context and intervention to be captured in real time. This information is fundamental to holistically explaining what intervention was implemented, understanding how and why the intervention worked or not and informing the transferability of the intervention into routine clinical practice.
Case study designs are not prescriptive, but process evaluations using case study should consider the purpose, trial design, the theories or assumptions underpinning the intervention, and the conceptual and theoretical frameworks informing the evaluation. We have discussed each of these considerations in turn, providing a comprehensive overview of issues for process evaluations using a case study design. There is no single or best way to conduct a process evaluation or a case study, but researchers need to make informed choices about the process evaluation design. Although this paper focuses on process evaluations, we recognise that case study design could also be useful during intervention development and feasibility trials. Elements of this paper are also applicable to other study designs involving trials.
Availability of data and materials
No data and materials were used.
Abbreviations
DQIP: Data-driven Quality Improvement in Primary Care
MRC: Medical Research Council
NSAIDs: Nonsteroidal anti-inflammatory drugs
OPAL: Optimizing Pelvic Floor Muscle Exercises to Achieve Long-term benefits
Blencowe NB. Systematic review of intervention design and delivery in pragmatic and explanatory surgical randomized clinical trials. Br J Surg. 2015;102:1037–47.
Dixon-Woods M. The problem of context in quality improvement. In: Foundation TH, editor. Perspectives on context: The Health Foundation; 2014.
Wells M, Williams B, Treweek S, Coyle J, Taylor J. Intervention description is not enough: evidence from an in-depth multiple case study on the untold role and impact of context in randomised controlled trials of seven complex interventions. Trials. 2012;13(1):95.
Grant A, Sullivan F, Dowell J. An ethnographic exploration of influences on prescribing in general practice: why is there variation in prescribing practices? Implement Sci. 2013;8(1):72.
Lang ES, Wyer PC, Haynes RB. Knowledge translation: closing the evidence-to-practice gap. Ann Emerg Med. 2007;49(3):355–63.
Ward V, House AF, Hamer S. Developing a framework for transferring knowledge into action: a thematic analysis of the literature. J Health Serv Res Policy. 2009;14(3):156–64.
Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.
Yin R. Case study research and applications: design and methods. Los Angeles: Sage Publications Inc; 2018.
Stake R. The art of case study research. Thousand Oaks, California: Sage Publications Ltd; 1995.
Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, Moore L, O’Cathain A, Tinati T, Wight D, et al. Process evaluation of complex interventions: Medical Research Council guidance. Br Med J. 2015;350:h1258.
Hawe P. Minimal, negligible and negligent interventions. Soc Sci Med. 2015;138:265–8.
Moore GF, Evans RE, Hawkins J, Littlecott H, Melendez-Torres GJ, Bonell C, Murphy S. From complex social interventions to interventions in complex social systems: future directions and unresolved questions for intervention development and evaluation. Evaluation. 2018;25(1):23–45.
Greenhalgh T, Papoutsi C. Studying complexity in health services research: desperately seeking an overdue paradigm shift. BMC Med. 2018;16(1):95.
Rutter H, Savona N, Glonti K, Bibby J, Cummins S, Finegood DT, Greaves F, Harper L, Hawe P, Moore L, et al. The need for a complex systems model of evidence for public health. Lancet. 2017;390(10112):2602–4.
Moore G, Cambon L, Michie S, Arwidson P, Ninot G, Ferron C, Potvin L, Kellou N, Charlesworth J, Alla F, et al. Population health intervention research: the place of theories. Trials. 2019;20(1):285.
Kislov R. Engaging with theory: from theoretically informed to theoretically informative improvement research. BMJ Qual Saf. 2019;28(3):177–9.
Boulton R, Sandall J, Sevdalis N. The cultural politics of ‘Implementation Science’. J Med Humanit. 2020;41(3):379–94. https://doi.org/10.1007/s10912-020-09607-9.
Cheng KKF, Metcalfe A. Qualitative methods and process evaluation in clinical trials context: where to head to? Int J Qual Methods. 2018;17(1):1609406918774212.
Richards DA, Bazeley P, Borglin G, Craig P, Emsley R, Frost J, Hill J, Horwood J, Hutchings HA, Jinks C, et al. Integrating quantitative and qualitative data and findings when undertaking randomised controlled trials. BMJ Open. 2019;9(11):e032081.
Thomas G. How to do your case study. 2nd ed. London: Sage Publications Ltd; 2016.
Grant A, Dreischulte T, Guthrie B. Process evaluation of the Data-driven Quality Improvement in Primary Care (DQIP) trial: case study evaluation of adoption and maintenance of a complex intervention to reduce high-risk primary care prescribing. BMJ Open. 2017;7(3).
Pfadenhauer L, Rohwer A, Burns J, Booth A, Lysdahl KB, Hofmann B, Gerhardus A, Mozygemba K, Tummers M, Wahlster P, et al. Guidance for the assessment of context and implementation in health technology assessments (HTA) and systematic reviews of complex interventions: the Context and Implementation of Complex Interventions (CICI) framework: Integrate-HTA; 2016.
Bate P, Robert G, Fulop N, Ovretveit J, Dixon-Woods M. Perspectives on context. London: The Health Foundation; 2014.
Ovretveit J. Understanding the conditions for improvement: research to discover which context influences affect improvement success. BMJ Qual Saf. 2011;20.
Medical Research Council. Process evaluation of complex interventions: UK Medical Research Council (MRC) guidance; 2015.
May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11(1):141.
Bate P. Context is everything. In: Perspectives on context. London: The Health Foundation; 2014.
Horton TJ, Illingworth JH, Warburton WHP. Overcoming challenges in codifying and replicating complex health care interventions. Health Aff. 2018;37(2):191–7.
O'Connor AM, Tugwell P, Wells GA, Elmslie T, Jolly E, Hollingworth G, McPherson R, Bunn H, Graham I, Drake E. A decision aid for women considering hormone therapy after menopause: decision support framework and evaluation. Patient Educ Couns. 1998;33:267–79.
Creswell J, Poth C. Qualitative inquiry and research design. 4th ed. Thousand Oaks, California: Sage Publications; 2018.
Carolan CM, Forbat L, Smith A. Developing the DESCARTE model: the design of case study research in health care. Qual Health Res. 2016;26(5):626–39.
Takahashi ARW, Araujo L. Case study research: opening up research opportunities. RAUSP Manage J. 2020;55(1):100–11.
Tight M. Understanding case study research, small-scale research with meaning. London: Sage Publications; 2017.
May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalisation process theory. Sociology. 2009;43:535.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
Pawson R, Tilley N. Realist evaluation. London: Sage; 1997.
Dreischulte T, Donnan P, Grant A, Hapca A, McCowan C, Guthrie B. Safer prescribing - a trial of education, informatics & financial incentives. N Engl J Med. 2016;374:1053–64.
Grant A, Dreischulte T, Guthrie B. Process evaluation of the Data-driven Quality Improvement in Primary Care (DQIP) trial: active and less active ingredients of a multi-component complex intervention to reduce high-risk primary care prescribing. Implement Sci. 2017;12(1):4.
Dreischulte T, Grant A, Hapca A, Guthrie B. Process evaluation of the Data-driven Quality Improvement in Primary Care (DQIP) trial: quantitative examination of variation between practices in recruitment, implementation and effectiveness. BMJ Open. 2018;8(1):e017133.
Grant A, Dean S, Hay-Smith J, Hagen S, McClurg D, Taylor A, Kovandzic M, Bugge C. Effectiveness and cost-effectiveness randomised controlled trial of basic versus biofeedback-mediated intensive pelvic floor muscle training for female stress or mixed urinary incontinence: protocol for the OPAL (Optimising Pelvic Floor Exercises to Achieve Long-term benefits) trial mixed methods longitudinal qualitative case study and process evaluation. BMJ Open. 2019;9(2):e024152.
Hagen S, McClurg D, Bugge C, Hay-Smith J, Dean SG, Elders A, Glazener C, Abdel-fattah M, Agur WI, Booth J, et al. Effectiveness and cost-effectiveness of basic versus biofeedback-mediated intensive pelvic floor muscle training for female stress or mixed urinary incontinence: protocol for the OPAL randomised trial. BMJ Open. 2019;9(2):e024153.
Steckler A, Linnan L. Process evaluation for public health interventions and research. San Francisco: Jossey-Bass; 2002.
Durlak JA. Why programme implementation is so important. J Prev Intervent Commun. 1998;17(2):5–18.
Bonell C, Oakley A, Hargreaves J, Strange V, Rees R. Assessment of generalisability in trials of health interventions: suggested framework and systematic review. Br Med J. 2006;333(7563):346–9.
Grant A, Treweek S, Dreischulte T, Foy R, Guthrie B. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials. 2013;14(1):15.
Yin R. Case study research: design and methods. London: Sage Publications; 2003.
Bugge C, Hay-Smith J, Grant A, Taylor A, Hagen S, McClurg D, Dean S. A 24 month longitudinal qualitative study of women’s experience of electromyography biofeedback pelvic floor muscle training (PFMT) and PFMT alone for urinary incontinence: adherence, outcome and context. ICS Gothenburg; 2019. https://www.ics.org/2019/abstract/473. Accessed 10 Sep 2020.
Hagen S, Elders A, Stratton S, Sergenson N, Bugge C, Dean S, Hay-Smith J, Kilonzo M, Dimitrova M, Abdel-Fattah M, Agur W, Booth J, Glazener C, Guerrero K, McDonald A, Norrie J, Williams LR, McClurg D. Effectiveness of pelvic floor muscle training with and without electromyographic biofeedback for urinary incontinence in women: multicentre randomised controlled trial. BMJ. 2020;371:m3719. https://doi.org/10.1136/bmj.m3719.
Cook TD. Emergent principles for the design, implementation, and analysis of cluster-based experiments in social science. Ann Am Acad Pol Soc Sci. 2005;599(1):176–98.
Hoffmann T, Glasziou P, Boutron I, Milne R, Perera R, Moher D. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. Br Med J. 2014;348:g1687.
Hawe P, Shiell A, Riley T. Complex interventions: how “out of control” can a randomised controlled trial be? Br Med J. 2004;328(7455):1561–3.
Grant A, Dreischulte T, Treweek S, Guthrie B. Study protocol of a mixed-methods evaluation of a cluster randomised trial to improve the safety of NSAID and antiplatelet prescribing: Data-driven Quality Improvement in Primary Care. Trials. 2012;13:154.
Flyvbjerg B. Five misunderstandings about case-study research. Qual Inq. 2006;12(2):219–45.
Thorne S. The great saturation debate: what the “S word” means and doesn’t mean in qualitative research reporting. Can J Nurs Res. 2020;52(1):3–5.
Guest G, Bunce A, Johnson L. How many interviews are enough?: an experiment with data saturation and variability. Field Methods. 2006;18(1):59–82.
Guest G, Namey E, Chen M. A simple method to assess and report thematic saturation in qualitative research. PLoS One. 2020;15(5):e0232076.
Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228–38.
Rycroft-Malone J. The PARIHS framework: a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304.
Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implement Sci. 2019;14(1):103.
Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. Thousand Oaks: Sage Publications Ltd; 2007.
Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol. 2009;43:267–76.
Craig P, Ruggiero E, Frohlich KL, Mykhalovskiy E, White M. Taking account of context in population health intervention research: guidance for producers, users and funders of research: National Institute for Health Research; 2018. https://www.ncbi.nlm.nih.gov/books/NBK498645/pdf/Bookshelf_NBK498645.pdf .
Acknowledgements
We would like to thank Professor Shaun Treweek for the discussions about context in trials.
Funding
No funding was received for this work.
Authors and affiliations
School of Nursing, Midwifery and Paramedic Practice, Robert Gordon University, Garthdee Road, Aberdeen, AB10 7QB, UK
Faculty of Health Sciences and Sport, University of Stirling, Pathfoot Building, Stirling, FK9 4LA, UK
Department of Surgery and Cancer, Imperial College London, Charing Cross Campus, London, W6 8RP, UK
AG, CB and MW conceptualised the study. AG wrote the paper. CB and MW commented on the drafts. All authors have approved the final manuscript.
Correspondence to Aileen Grant .
Ethics approval and consent to participate
Ethics approval and consent to participate is not appropriate as no participants were included.
Consent for publication
Consent for publication is not required as no participants were included.
Competing interests
The authors declare no competing interests.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Reprints and Permissions
About this article
Cite this article
Grant, A., Bugge, C. & Wells, M. Designing process evaluations using case study to explore the context of complex interventions evaluated in trials. Trials 21, 982 (2020). https://doi.org/10.1186/s13063-020-04880-4
Received: 09 April 2020
Accepted: 06 November 2020
Published: 27 November 2020
DOI: https://doi.org/10.1186/s13063-020-04880-4
Keywords
- Process evaluation
- Case study design
- Leadership Team
- Our Approach
- PMO Services
- Program & Project Managers
- PM Training & Development
- Case Studies
- Our Culture
- Employee Benefits
- Job Opportunities
- Benefits Realization Management Diagnostic
- Project Management Maturity Guide
- Guide to Agile Project Management
- PM Maturity Assessment
- Project Management as a Service (PMaaS) and Why It’s the Future
- Project Management as a Service (PMaaS)
About PM Solutions
PM Solutions is a project management consulting firm that helps PMO, project, and business leaders apply project and portfolio management practices that drive performance and operational efficiency.
- Co-Founder & Co-CEO J. Kent Crawford
- Co-Founder & Co-CEO Deborah Bigelow Crawford
- President, PM Solutions & PM College Bruce Miller
- Vice President, Client Success, Eric Foss
- Managing Director, HR & Administration, Karen Alfonsi
- Director, Marketing and Communications, Carrie Capili
With our approach , companies can expect high-value, high-impact solutions, and measurable, sustainable results.
- PMO Deployment, Operation, and Enhancement
- Project Review & Recovery
- Project Portfolio Management (PPM)
- Project Management Maturity Advancement
- Organizational Change Management
- Project Management Methodology Implementation
- Demand Management
- Project Management Mentors
- Resource Management
- Vendor Management
Project & Program Managers
We can provide you with highly experienced program and project managers ; experts to help guide, lead, and support high-visibility initiatives.
PM Training & Development
PM College® provides corporate project management training and competency programs for clients around the world.
By Project Initiatives
- Cost Reduction Initiatives (1)
- Data Center Consolidation (1)
- High-risk Capital Initiatives (1)
- Infrastructure Program Management (0)
- Manufacturing Facility Operations (1)
- Mentoring (11)
- Methodology (4)
- New Product Development (1)
- Organizational Change (6)
- PMO Assessment (4)
- PMO Deployment (4)
- Process Improvement (7)
- Program & Portfolio Management (10)
- Project Audits (1)
- Project Management Training (7)
- Regulatory Compliance (1)
- Resource Management (1)
- Strategy Execution (1)
- Systems Integration Deployment (0)
- Troubled Project Recovery (3)
- Vendor Management (2)
- Automotive (1)
- Energy & Utilities (7)
- Financial Services (2)
- Human Resources (1)
- Information Technology (5)
- Insurance (5)
- Manufacturing (6)
- Pharma/Biotech (2)
- Professional Services (2)
- Research and Development (1)
- Retail & Merchandise (1)
- Security (1)
- Benefits Realization (4)
- Change Management (6)
- IT Project Management (4)
- Outsourcing Project Management (4)
- Performance & Value Measurement (12)
- Project Management Maturity (23)
- Project Management Methodology (12)
- Project Management Office (58)
- Project Management Training (31)
- Project Management Trends (50)
- Project Manager Competency (18)
- Project Portfolio Management (11)
- Project Recovery (9)
- Resource Management (5)
- Strategy & Governance (14)
- Articles (46)
- Brochures (3)
- eNewsletters (19)
- Research (42)
- Webinars (24)
- White Papers (34)
- contact us get in touch call: 800.983.0388
Home » Case Studies » Process Improvement
Pm solutions has a proven experience in providing solutions to a broad range of markets. our project management case studies cover a wide variety of needs across a number of industries., more process improvement case studies, ontario power generation achieves project excellence through the establishment of an enterprise project management office.
"We have really changed public perception, and as a result, increased our credibility in OPG’s ability to execute projects. “- Mike Martelli, Chief Projects Officer, OPG Read More »
- Energy & Utilities
- PMO Assessment
- Process Improvement
- Program & Portfolio Management
PM Governance Combined with Agile Tools Improves Delivery and Quality of Financial Services Programs
Changing state and federal regulatory compliance challenges caused this company to reinvent its custom-built storefront and home office systems. The IT and PMO teams were geared more for operational maintenance rather than for the complexities of developing new systems. This resulted in an overwhelming workload and schedule overrun measured in years. A years-overdue project was brought to Operational Pilot, saving millions in development time. Read More »
- Financial Services
Transformative Leadership Sets Financial Services Enterprise PMO on Fast Track to Strategic Value
For this privately held financial services company, a compelling journey of business transformation started with a good read. Read More »
- PMO Deployment
- Project Management Training
- Strategy Execution
Pharmaceutical Company’s Internal PM Certification Program Brings More Predictable Project Outcomes
An internal PM certification program was rapidly developed and implemented. During the first three years, over 400 participants attended more than 30 courses. Team members, project managers, and sponsors gained a good understanding of project management and the techniques needed to achieve more consistent project delivery. The program’s ability to remain intact after an acquisition validated the program’s value to the larger company globally. Read More »
Accident Fund’s Award-Winning PMO Puts Processes Before Tools in Implementing Project Portfolio Management
PM Solutions used its PPM Maturity and Project Management Maturity Models to develop an improvement roadmap that included a number of specific recommendations from a project management governance perspective. The established PPM processes helped eliminate approximately 100 non-value-adding projects, led to the successful implementation of a PPM tool, and won the company national acclaim, all in under three years.
Mentoring Brings an Enhanced Focus on Accountability to Merchants Insurance Group
PM Solutions led a targeted mentoring program for the client's project leads, providing one-on-one advice and coaching on how to apply consistent project management practices within the confines of their actual projects. Value Delivered: Improved on-time project delivery to 80%, with one program booking $100,000 of business in the first three days and another realizing 758% revenue growth in the first two months of introduction.
Wireless Warehouse Initiative Transforms Materials Management Operations and Improves Delivery Speed by 66%
For a worldwide provider of turbine-driven gas compression, oil pumping, and power generation packages, PM Solutions provided a senior-level project manager to lead the implementation of a warehouse radio frequency (RF) solution. The project manager was responsible for overseeing requirements gathering, systems design, and the procurement and installation of the necessary hardware and software, and for coordinating user acceptance testing. Value Delivered: Thanks to a retooling of the materials management system incorporating RF automation, the company's warehouse accelerated delivery of parts by 66%, significantly reducing overall time to market. This complex automation project came in on time and more than 18% under budget.
Process Flow Case Study Answers
This article walks through the answers for the process flow case study.
See case study problem here: Creating Process Flow Diagrams – A Case Study
Pixel-Tech In Store Pickup
Current State Process Flow Diagram
This is the existing process flow as it is being done today, also referred to as the As-Is process.
Future State Process Flow Diagram
This is the process flow for the solution you would propose to solve the problem, also referred to as the To-Be process.
Tech Support Process
This is the current state process for providing technical support, also referred to as the As-Is process.
Tech Support To Be Process
This is the future state of the technical support process, which includes COVID-safe ways to pick up machines from employees.
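Beyond drawing the diagrams, it can help to capture each flow as an ordered list of steps and diff the two lists, so the changes the To-Be process introduces are explicit. The sketch below is a minimal illustration of that idea; the step names are hypothetical assumptions, not the actual Pixel-Tech or tech support process steps.

```python
# Hypothetical sketch: represent As-Is and To-Be flows as ordered step
# lists, then diff them to highlight what the proposed solution changes.
# The step names are illustrative, not taken from the case study.
import difflib

as_is = [
    "Customer places order online",
    "Store receives order",
    "Associate picks item from shelf",
    "Customer enters store",
    "Customer waits at counter",
    "Associate hands over item",
]

to_be = [
    "Customer places order online",
    "Store receives order",
    "Associate picks item from shelf",
    "Customer notified item is ready",
    "Customer checks in from parking lot",
    "Associate delivers item curbside",
]

# Lines prefixed with "-" are steps the To-Be flow removes; "+" lines
# are steps it adds. Unchanged steps appear as context.
for line in difflib.unified_diff(as_is, to_be, "As-Is", "To-Be", lineterm=""):
    print(line)
```

A textual diff like this is no substitute for the diagram itself, but it is a quick way to sanity-check a To-Be proposal before presenting it to the team.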
- Is there anything else you would add to this workflow?
- How would you present these changes to your team?