
Collaborative Problem Solving: What It Is and How to Do It

What collaborative problem solving is, how to solve problems as a team, and how to celebrate success together.

Problems arise. That's a well-known fact of life and business. When they do, it may seem simpler to take individual ownership of a problem and immediately start solving it. However, the most effective solutions often come through collaborative problem solving.

Webster's Dictionary defines collaborate as "to work jointly with others or together, especially in an intellectual endeavor." Collaborative problem solving (CPS), then, is essentially solving problems by working together as a team. While problems can be and are solved individually, CPS often produces the best resolution to a problem while also building a team atmosphere and encouraging creative thinking.

Because collaborative problem solving involves multiple people and ideas, there are some techniques that can help you stay on track, engage efficiently, and communicate effectively during collaboration.

  • Set Expectations. From the very beginning, expectations for openness and respect must be established for CPS to be effective. Everyone participating should feel that their ideas will be heard and valued.
  • Provide Variety. Involve people with different vantage points. This may mean involving various levels of leadership, from the ground floor to the top of the organization, or enlisting someone outside the immediate team who is affected by the problem. It may be that you involve someone from bookkeeping in a marketing problem-solving session. A perspective from someone not involved in the day-to-day of the problem can often provide valuable insight.
  • Communicate Clearly.  If the problem is not well-defined, the solution can't be. By clearly defining the problem, the framework for collaborative problem solving is narrowed and more effective.
  • Expand the Possibilities. Think beyond what is offered. Take a discarded idea and expand upon it. Turn it upside down and inside out. What is good about it? What needs improvement? Sometimes the best ideas are discarded ones that have been reworked.
  • Encourage Creativity. Out-of-the-box thinking is one of the great benefits of collaborative problem solving. Some proposed solutions may have no way of working, but a small nugget from that creative thought can evolve into the perfect solution.
  • Provide Positive Feedback. There are many reasons participants may hold back in a collaborative problem-solving meeting. Fear of performance evaluation, lack of confidence, lack of clarity, and hierarchy concerns are just a few of the reasons people may not initially participate in a meeting. Positive public feedback early on in the meeting will eliminate some of these concerns and create more participation and more possible solutions.
  • Consider Solutions. Once several possible ideas have been identified, discuss the advantages and drawbacks of each one until a consensus is reached.
  • Assign Tasks.  A problem identified and a solution selected is not a problem solved. Once a solution is determined, assign tasks to work towards a resolution. A team that has been invested in the creation of the solution will be invested in its resolution. The best time to act is now.
  • Evaluate the Solution. Reconnect as a team once the solution is implemented and the problem is solved. What went well? What didn't? Why? Collaboration doesn't necessarily end when the problem is solved. The solution to the problem is often the next step towards a new collaboration.

The burden that is lifted when a problem is solved is enough victory for some. However, a team that plays together should celebrate together. It's not only collaboration that brings unity to a team. It's also the combined celebration of a unified victory—the moment you look around and realize the collectiveness of your success.

We can help

Check out MindManager to learn more about how you can ignite teamwork and innovation by providing a clearer perspective on the big picture with a suite of sharing options and collaborative tools.



Review Article | Open access | Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169), Wei Wang & Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


Subjects: Science, technology and society

Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This research presents the major findings of a meta-analysis of 36 empirical studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem solving can result in a rise or decrease in critical thinking. The findings show that (1) collaborative problem solving is an effective teaching approach to foster students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem solving can significantly and successfully enhance students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have an impact on critical thinking, and they can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students’ critical thinking in the context of collaborative problem-solving.

Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020 ). Duch et al. ( 2001 ) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses problems with poor structure in real-world situations as the starting point for the learning process (Liang et al., 2017 ). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004 ; Liang et al., 2017 ).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. How critical thinking instruction should be implemented within collaborative problem-solving therefore remains an open question, and this gap leaves many teachers without clear guidance for teaching critical thinking (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) reported meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking was truly teachable. The study found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. The usefulness of collaborative problem-solving in fostering students’ critical thinking, however, was not determined by this study, nor did it reveal whether there existed significant variations among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. (2020) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. (2008) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. Their research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, according to research by Naber and Wyatt (2014) and Sendag and Odabasi (2009) on undergraduate and high school students, respectively, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation, but it could not significantly improve students’ critical thinking when compared to traditional classroom teaching.

The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to determine whether and to what degree collaborative problem-solving can result in a rise or decrease in critical thinking. Meta-analysis is a quantitative approach used to examine data from separate studies that all focus on the same research topic. It characterizes the magnitude of an effect by averaging the effect sizes of numerous individual studies, in an effort to reduce the uncertainty associated with any single study and produce more conclusive findings (Lipsey and Wilson, 2001).

To contribute to both research and practice, this paper carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects reported by the included studies are heterogeneous, how do the various moderating variables account for the disparities between the study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of the meta-analysis approach proposed by Cooper (2010). The relevant empirical research published in international educational journals within the 21st century was subjected to this meta-analysis using Rev-Man 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
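
To make the inter-rater check concrete, the following minimal Python sketch computes Cohen’s kappa for two hypothetical coders’ inclusion decisions. The data and labels are invented for illustration and are not taken from this study.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters assigning categorical labels to the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of items given identical labels.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement under independence, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in set(coder_a) | set(coder_b)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical inclusion decisions from two independent coders.
coder_1 = ["include", "exclude", "include", "include", "exclude", "include"]
coder_2 = ["include", "exclude", "include", "exclude", "exclude", "include"]
print(f"Cohen's kappa = {cohens_kappa(coder_1, coder_2):.2f}")
```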

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which gives the number of articles included and excluded during the selection process according to the stated study eligibility criteria.

Figure 1: Flowchart of the number of records identified, included, and excluded during article selection.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, and a consensus rate of approximately 94.7% was reached after discussion and negotiation to resolve any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles not in these languages and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three categories of information. The descriptive information covered basic details about each paper: the publishing year, author, serial number, and title.

The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). In line with the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1), where learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported, technique-supported, and resource-supported learning scaffolds; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.
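
To make the coding template concrete, the following minimal Python sketch represents one coded record with the 16 fields described above. The field names paraphrase Table 1 rather than reproducing the authors’ exact labels, and every value in the example is invented.

```python
from dataclasses import dataclass

@dataclass
class CodedStudy:
    """One coded record following the 16 fields of the template in Table 1.
    Field names paraphrase the text; the example values below are invented."""
    # Descriptive information
    serial_number: int
    author: str
    publishing_year: int
    title: str
    # Independent and dependent variables
    intervention: str                 # "collaborative" vs. "non-collaborative" problem-solving
    critical_thinking_dimension: str  # "cognitive skills" or "attitudinal tendency"
    # Moderating variables
    learning_stage: str               # e.g., "higher education", "high school"
    teaching_type: str                # "mixed", "integrated", or "independent" course
    intervention_duration: str        # e.g., "4-12 weeks"
    group_size: str                   # e.g., "2-3 persons"
    learning_scaffold: str            # "teacher-", "technique-", or "resource-supported"
    measuring_tool: str               # "standardized" or "self-adapting"
    subject_area: str                 # e.g., "science"
    # Data information used to compute the effect size
    sample_size: int
    mean_value: float
    standard_deviation: float

# A hypothetical record, not taken from the 36 included studies.
example = CodedStudy(1, "Hypothetical Author", 2015, "An invented example study",
                     "collaborative", "cognitive skills",
                     "higher education", "integrated", "4-12 weeks", "2-3 persons",
                     "teacher-supported", "standardized", "science",
                     40, 78.2, 9.1)
print(example)
```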

The data information contained three metrics for measuring critical thinking: sample size, average value, and standard deviation. It is vital to remember that studies with different experimental designs frequently adopt different formulas to determine the effect size. This paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
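
The exact formula used is given in Supplementary Table S3. As an illustration only, the sketch below implements the pretest-posttest-control SMD commonly attributed to Morris (2008): the difference in pre-to-post gains divided by the pooled pretest standard deviation, with a small-sample correction. Treat it as a generic sketch under that assumption, not a reproduction of the authors’ computation; the summary statistics are invented.

```python
import math

def smd_pretest_posttest_control(m_pre_t, m_post_t, sd_pre_t, n_t,
                                 m_pre_c, m_post_c, sd_pre_c, n_c):
    """Standardized mean difference for pretest-posttest-control designs:
    difference in gains standardized by the pooled pretest SD, with a
    small-sample bias correction, in the spirit of Morris (2008)."""
    # Pooled pretest standard deviation across treatment and control groups.
    sd_pre_pooled = math.sqrt(((n_t - 1) * sd_pre_t ** 2 + (n_c - 1) * sd_pre_c ** 2)
                              / (n_t + n_c - 2))
    # Small-sample bias correction factor.
    correction = 1 - 3 / (4 * (n_t + n_c - 2) - 1)
    gain_difference = (m_post_t - m_pre_t) - (m_post_c - m_pre_c)
    return correction * gain_difference / sd_pre_pooled

# Hypothetical summary statistics for one study.
print(round(smd_pretest_posttest_control(70.1, 78.2, 9.0, 40,
                                         69.8, 72.0, 9.5, 38), 2))
```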

Procedure for extracting and coding data

According to the data coding template (see Table 1), the information from the 36 papers was retrieved by two researchers, who then entered it into Excel (see Supplementary Table S1). The results of each study were extracted separately in the data extraction procedure if an article contained numerous studies on critical thinking, or if a study assessed different critical thinking dimensions. For instance, Tiwari et al. (2010) used four time points, which were viewed as separate studies, to examine the outcomes of critical thinking, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficient was approximately 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). Following that, tests for publication bias and heterogeneity were run on the sample data using the Rev-Man 5.4 software, and the test results were then used to conduct the meta-analysis.

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the general state of research on the relevant subject, and it can undermine the reliability and accuracy of the meta-analysis. For this reason, the sample data need to be checked for publication bias (Stewart et al., 2006). A popular method for this check is the funnel plot: publication bias is unlikely when the data are evenly dispersed on either side of the average effect size and concentrated toward the top of the plot. The funnel plot for this analysis (see Fig. 2) shows the data evenly dispersed within the upper portion of the effective zone, indicating that publication bias is unlikely in this situation.

Figure 2: Funnel plot of publication bias for the 79 effect quantities across the 36 included studies.
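
For readers who want to reproduce this kind of diagnostic, the following minimal matplotlib sketch draws a funnel plot (effect size against standard error, with the y-axis inverted so more precise studies sit at the top). The data are simulated, not the 79 effect quantities from this review.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Simulated effect sizes scattered around a true mean of 0.8, with
# study-level standard errors drawn at random (illustrative only).
standard_errors = rng.uniform(0.05, 0.45, size=60)
effect_sizes = rng.normal(loc=0.8, scale=standard_errors)

fig, ax = plt.subplots()
ax.scatter(effect_sizes, standard_errors, s=18)
ax.axvline(effect_sizes.mean(), linestyle="--")  # vertical line at the mean effect
ax.invert_yaxis()                                # precise studies (small SE) at the top
ax.set_xlabel("Effect size (SMD)")
ax.set_ylabel("Standard error")
ax.set_title("Funnel plot (simulated data)")
plt.show()
```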

Heterogeneity test

The results of a heterogeneity test on the effect sizes are used to select the appropriate effect model for the meta-analysis. In a meta-analysis, it is common practice to gauge the degree of data heterogeneity using the I² value; I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for the adoption of a random effect model, and otherwise a fixed effect model ought to be applied (Lipsey and Wilson, 2001). The findings of the heterogeneity test in this paper (see Table 2) revealed that I² was 86% and displayed significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random effect model.
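
To make the decision rule concrete, here is a minimal Python sketch that computes Cochran’s Q and I² from standard inverse-variance formulas for a handful of invented effect sizes. It illustrates the statistic the text refers to; it does not reproduce the Rev-Man 5.4 output.

```python
import numpy as np

def heterogeneity(effect_sizes, variances):
    """Cochran's Q and the I^2 statistic for a set of study-level effect sizes."""
    y = np.asarray(effect_sizes, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    fixed_effect = np.sum(w * y) / np.sum(w)       # fixed-effect pooled estimate
    q = np.sum(w * (y - fixed_effect) ** 2)        # Cochran's Q
    df = len(y) - 1
    # Percentage of total variability attributable to between-study heterogeneity.
    i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i_squared

# Invented effect sizes and sampling variances for illustration.
q, i2 = heterogeneity([0.4, 0.9, 1.3, 0.2, 0.8], [0.04, 0.06, 0.05, 0.03, 0.07])
print(f"Q = {q:.2f}, I^2 = {i2:.0f}%")
# Common rule of thumb: I^2 >= 50% -> random-effects model; otherwise fixed-effect.
```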

The analysis of the overall effect size

This meta-analysis utilized a random effect model to examine the 79 effect quantities from the 36 studies, given the heterogeneity identified above. In accordance with Cohen’s criterion (Cohen, 1992), the analysis results, which are shown in the forest plot of the overall effect (see Fig. 3), make it clear that the overall effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.

Figure 3: Forest plot of the overall effect size across the 36 included studies.
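
As a complement to the forest plot, the following minimal Python sketch pools invented study-level effect sizes with a DerSimonian-Laird random-effects model and reports the pooled ES, 95% CI, and z statistic. It illustrates the general method; Rev-Man 5.4 may differ in detail, and all numbers are hypothetical.

```python
import math
import numpy as np

def random_effects_pool(effect_sizes, variances):
    """DerSimonian-Laird random-effects pooling: returns the pooled effect,
    its 95% confidence interval, and the z statistic."""
    y = np.asarray(effect_sizes, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                    # fixed-effect weights
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)               # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance estimate
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = math.sqrt(1.0 / np.sum(w_star))
    z = pooled / se
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, z

# Invented effect sizes and sampling variances for illustration.
pooled, (lo, hi), z = random_effects_pool([0.4, 0.9, 1.3, 0.2, 0.8],
                                          [0.04, 0.06, 0.05, 0.03, 0.07])
print(f"ES = {pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], z = {z:.2f}")
```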

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to the growth of critical thinking. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions of critical thinking, it is essential to point out that the improvements in students’ attitudinal tendency are much more pronounced and have a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners’ cognitive skills are more modest, reaching only an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

The 79 effect quantities in the whole forest plot underwent a two-tailed test, which revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between the effect sizes that may have been influenced by moderating factors other than sampling error. Therefore, subgroup analysis was used to explore possible moderating factors that might produce this considerable heterogeneity, such as the learning stage, learning scaffold, teaching type, group size, duration of the intervention, measuring tool, and subject area included in the 36 experimental designs, in order to further identify the key factors that influence critical thinking (a minimal computational sketch of the intergroup comparison is given after the results below). The findings (see Table 4) indicate that the various moderating factors have advantageous effects on critical thinking. In this situation, the subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not show significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school ranked first in effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect size was ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that the most effective approach to cultivate critical thinking utilizing collaborative problem solving is through the teaching type of mixed courses.

Various intervention durations significantly improved critical thinking, and there were significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes related to this variable showed a tendency to increase with longer intervention durations. The improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that the intervention duration and critical thinking’s impact are positively correlated, with a longer intervention duration having a greater effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The resource-supported learning scaffold (ES = 0.69, P < 0.01) acquired a medium-to-higher level of impact, the technique-supported learning scaffold (ES = 0.63, P < 0.01) also attained a medium-to-higher level of impact, and the teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high level of significant impact. These results show that the learning scaffold with teacher support has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size for 2–3 people was the biggest in this situation (ES = 0.99, P < 0.01), and when the group size was greater than 7 people, the improvement in critical thinking was at the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as group size grows, the overall impact declines.

Various measuring tools influenced critical thinking positively, without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). In this situation, the self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, achieving a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, we are unable to explain why it is crucial in fostering the growth of critical thinking by utilizing the approach of collaborative problem-solving.

Different subject areas had varying degrees of impact on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, achieving a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also achieved a significant level of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), having only a medium-low degree of effect compared to education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking utilizing the approach of collaborative problem-solving.
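
As anticipated above, here is a minimal Python sketch of a fixed-effect intergroup (subgroup) comparison of the kind that underlies the χ² values reported in this section. The subgroup effects and standard errors below are invented, and the formula is the generic Q-between test rather than Rev-Man 5.4’s exact implementation.

```python
import numpy as np
from scipy.stats import chi2

def intergroup_q_test(subgroup_effects, subgroup_ses):
    """Fixed-effect test for differences between subgroup pooled effects:
    Q_between is compared to a chi-square distribution with G - 1 degrees of freedom."""
    theta = np.asarray(subgroup_effects, dtype=float)
    w = 1.0 / np.asarray(subgroup_ses, dtype=float) ** 2   # weights from subgroup SEs
    overall = np.sum(w * theta) / np.sum(w)                # weighted overall estimate
    q_between = np.sum(w * (theta - overall) ** 2)
    df = len(theta) - 1
    p_value = chi2.sf(q_between, df)
    return q_between, df, p_value

# Hypothetical pooled effects and standard errors for three subgroups (e.g., teaching types).
q_b, df, p = intergroup_q_test([1.34, 0.81, 0.27], [0.20, 0.12, 0.15])
print(f"Q_between = {q_b:.2f}, df = {df}, p = {p:.3f}")
```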

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners’ critical thinking as a whole and has a favorable promotional effect on the two dimensions of critical thinking. According to certain studies, collaborative problem solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students’ critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent data support for the above research views. Thus, the findings of this meta-analysis not only effectively address the first research question regarding the overall effect of cultivating critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills) utilizing the approach of collaborative problem-solving, but also enhance our confidence in cultivating critical thinking by using the collaborative problem-solving intervention approach in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, but the corresponding improvements in cognitive skill are only marginally better. According to certain studies, cognitive skill differs from attitudinal tendency in classroom instruction: the cultivation and development of the former as a key ability is a process of gradual accumulation, while the latter, as an attitude, is affected by the context of the teaching situation (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging, because it takes the learners as the focus and examines poorly structured problems in real situations, and it can inspire students to fully realize their potential for problem-solving, which will significantly improve their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency impacts cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that the two specific dimensions of critical thinking, as well as critical thinking as a whole, are affected by collaborative problem-solving, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies as the two dimensions of critical thinking. To fully develop students’ capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

In order to further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors, such as the teaching type, learning stage, group size, learning scaffold, duration of the intervention, measuring tool, and subject area included in the 36 experimental designs, could all support the cultivation of critical thinking in the context of collaborative problem-solving. Among them, the effect size differences between the levels of the learning stage and the measuring tool are not significant, which does not explain why these two factors are crucial in supporting the cultivation of critical thinking utilizing the approach of collaborative problem-solving.

In terms of the learning stage, various learning stages influenced critical thinking positively without significant intergroup differences, indicating that we are unable to explain why it is crucial in fostering the growth of critical thinking.

Although higher education accounts for 70.89% of all the empirical studies performed by researchers, high school may be the appropriate learning stage at which to foster students’ critical thinking by utilizing the approach of collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development, which needs to be further studied in follow-up research.

With regard to teaching type, mixed course teaching may be the best teaching method to cultivate students’ critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which is not conducive to the transfer of those thinking methods; conversely, if students’ thinking is trained only in subject teaching without systematic method training, it is challenging to apply it to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course in parallel to other subject teaching can achieve the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit critical thinking instruction (Bensley and Spero, 2014).

In terms of the intervention duration, the overall effect size shows an upward tendency with longer intervention times. Thus, the intervention duration and critical thinking’s impact are positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it is developed over a lengthy period of time through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Therefore, future empirical studies ought to take these restrictions into account and plan for a longer period of critical thinking instruction.

With regard to group size, a group size of 2–3 persons has the highest effect size, and the comprehensive effect size decreases with increasing group size in general. This outcome is in line with some research findings; as an example, a group composed of two to four members is most appropriate for collaborative learning (Schellens and Valcke, 2006 ). However, the meta-analysis results also indicate that once the group size exceeds 7 people, small groups cannot produce better interaction and performance than large groups. This may be because the learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which is helpful to cultivate critical thinking utilizing the approach of collaborative problem-solving.

With regard to the learning scaffold, the three different kinds of learning scaffolds can all enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with some research findings; as an example, a successful strategy is to encourage learners to collaborate, come up with solutions, and develop critical thinking skills by using learning scaffolds (Reiser, 2004 ; Xu et al., 2022 ); learning scaffolds can lower task complexity and unpleasant feelings while also enticing students to engage in learning activities (Wood et al., 2006 ); learning scaffolds are designed to assist students in using learning approaches more successfully to adapt the collaborative problem-solving process, and the teacher-supported learning scaffolds have the greatest influence on critical thinking in this process because they are more targeted, informative, and timely (Xu et al., 2022 ).

With respect to the measuring tool, despite the fact that standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as trustworthy and effective by worldwide experts, only 54.43% of the research included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking using standardized measurement tools. According to Simpson and Courtney (2002, p. 91), “the measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” As a result, in order to more fully and precisely gauge how learners’ critical thinking has evolved, we must properly modify standardized measuring tools based on collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of science departments (e.g., mathematics, science, medical science) is larger than that of language arts and social sciences. Some recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can prove the rationality of their judgment according to accurate evidence and reasonable standards when they face challenges or poorly structured problems (Kyndt et al., 2013 ), which makes critical thinking crucial for developing scientific understanding and applying this understanding to practical problem solving for problems related to science, technology, and society (Yore et al., 2007 ).

Suggestions for critical thinking teaching

Other than those stated in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should put a special emphasis on the two core elements, which are collaboration and problem-solving, to design real problems based on collaborative situations. This meta-analysis provides evidence to support the view that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions on real problems during class instruction are key ways to teach critical thinking rather than simply reading speculative articles without practice (Mulnix, 2012 ). Furthermore, the improvement of students’ critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008 ). Consequently, it is essential for teachers to put a special emphasis on the two core elements, which are collaboration and problem-solving, and design real problems and encourage students to discuss, negotiate, and argue based on collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking, utilizing the approach of collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011 ; Leng and Lu, 2020 ), with the goal of cultivating learners’ critical thinking for flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners’ critical thinking. Therefore, teachers should design and implement mixed course teaching with real collaborative problem-solving situations in combination with the knowledge content of specific disciplines in conventional teaching, teach methods and strategies of critical thinking based on poorly structured problems to help students master critical thinking, and provide practical activities in which students can interact with each other to develop knowledge construction and critical thinking utilizing the approach of collaborative problem-solving.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote critical thinking. The learning scaffold supported by teachers had the greatest impact on learners’ critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be effectively taught when teachers recognize the significance of critical thinking for students’ growth and use the proper approaches while designing instructional activities (Forawi, 2016). Therefore, with the intention of enabling teachers to create learning scaffolds to cultivate learners’ critical thinking utilizing the approach of collaborative problem solving, it is essential to concentrate on teacher-supported learning scaffolds and enhance the instruction in teaching critical thinking for teachers, especially preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis, but future research can correct them. First, the search languages were restricted to English and Chinese, so it is possible that pertinent studies written in other languages were overlooked, resulting in an incomplete set of articles for review. Second, some data provided by the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and the differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, more studies were released while this meta-analysis was being conducted; it is therefore limited by its search window. With the development of relevant research, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of the effectiveness of collaborative problem-solving in promoting students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem solving is an effective teaching approach to foster learners’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students’ attitudinal tendency, and the comprehensive effect is significant (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by both the results and the discussion, there are varying degrees of beneficial effects on students’ critical thinking from all seven moderating factors, which were found across the 36 studies. In this context, the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have a positive impact on critical thinking, and they can be viewed as important moderating factors that affect how critical thinking develops. Since the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tools (χ² = 0.08, P = 0.78 > 0.05) did not demonstrate any significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08

Download references




How to ace collaborative problem solving

April 30, 2023

They say two heads are better than one, but is that true when it comes to solving problems in the workplace? To solve any problem—whether personal (e.g., deciding where to live), business-related (e.g., raising product prices), or societal (e.g., reversing the obesity epidemic)—it’s crucial to first define the problem. In a team setting, that translates to establishing a collective understanding of the problem, awareness of context, and alignment of stakeholders. “Both good strategy and good problem solving involve getting clarity about the problem at hand, being able to disaggregate it in some way, and setting priorities,” Rob McLean, McKinsey director emeritus, told McKinsey senior partner Chris Bradley in an Inside the Strategy Room podcast episode. Check out these insights to uncover how your team can come up with the best solutions for the most complex challenges by adopting a methodical and collaborative approach.

  • Want better strategies? Become a bulletproof problem solver
  • How to master the seven-step problem-solving process
  • Countering otherness: Fostering integration within teams
  • Psychological safety and the critical role of leadership development
  • If we’re all so busy, why isn’t anything getting done?
  • To weather a crisis, build a network of teams
  • Unleash your team’s full potential
  • Modern marketing: Six capabilities for multidisciplinary teams
  • Beyond collaboration overload


Collaborative Leadership


A collaboration among several groups and individuals is often needed to address a complex issue, and collaboration requires collaborative leadership. Collaborative leadership means maintaining a process that includes everyone involved in an issue or organization - a process that depends on collaborative problem solving and decision making. In this section, we will explore collaborative leadership, why it is useful, and how to practice it effectively.

What is collaborative leadership?

Collaborative leadership is really defined by a process, rather than by what leaders do. It has much in common with both servant leadership and transformational leadership. It starts, according to David Chrislip and Carl Larson, in Collaborative Leadership, from the premise that "...if you bring the appropriate people together in constructive ways with good information, they will create authentic visions and strategies for addressing the shared concerns of the organization or community."

Collaborative leadership can be employed in almost any situation, and indeed is practiced in some businesses with great success, but is seen more often in community coalitions and initiatives, in community-based health and human service organizations, or in alternative education. People often find it particularly useful in situations where "no one is in charge," where there are issues or problems so complex that no one person or entity has either the information or the power to change them. (This doesn't mean that no one has responsibility, but rather that sharing responsibility for the issue is necessary in order to arrive at a successful resolution of it.)

While it can be practiced in a number of ways, good collaborative leadership is almost always characterized by some specific traits. Among the most important:

  • Collaborative problem-solving and decision-making. It's not the leader's job to decide what to do and then tell the group. Rather, the group considers the problem, decides what to do, and counts on the leader to help them focus their effort.
  • Open process. The leader - or some other interested party - doesn't just start with his goals in mind and steer the group in that direction. Collaborative leadership means that the process of decision-making is truly collaborative, and has no set end-point when it begins. The end result is worked out among all the participants: that's collaboration.
  • Leadership of the process, rather than the group. The purpose of collaborative leadership is to help the collaborative process work, rather than to lead the people involved toward something - to a particular decision, for instance, or in a particular direction.

There are some differences between collaborative leadership within an organization and collaborative leadership among organizations. In the first case, a leader may have to spend much of her time initially trying to coax people to take leadership roles in certain circumstances, or even to participate in collaborative decision-making. In the second instance, a leader's biggest task may be to keep everyone from trying to lead in different directions all at once.

There are really two ways to define collaborative leadership:

  • Collaborative leadership: leadership of a collaborative effort. This definition refers to taking a leadership role in a coalition, organization, or other enterprise where everyone is on an equal footing and working together to solve a problem, create something new, or run an organization or initiative. The leader is not in control of the group, but has responsibility for guiding and coordinating the process by which the group decides upon and carries out actions to accomplish its goals.
  • Leading collaboratively: leadership as a collaborative effort. In this case - usually in an organizational rather than a coalition or community setting - leadership may shift, by group decision, from one person to another as different talents or abilities are called for, or (more often) leadership is permanently shared by all, or several, members of the group. Here, there is no one leader: the group functions as a true collaborative, and guides itself.

In this section, we will focus on the first of these situations, though the orienting principles are the same in both cases.

Why practice collaborative leadership?

A coalition or other collaboration will nearly always function best with collaborative leadership. Most other organizations and enterprises may function without collaborative leadership, but there are benefits that collaborative leadership can confer even in situations where there are other possible choices.

Advantages of collaborative leadership include:

  • Buy-in . Collaborative leadership encourages ownership of the enterprise, whether it's a coalition, an organization, a business, or a community project. By involving everyone in decision making and problem solving, it makes what people are doing theirs, rather than something imposed on them by someone else. The sense of ownership builds commitment to the common purpose.
  • More involvement in implementation . Members of a collaborative group are more likely to be willing to take responsibility for implementing the group's action plan, because they were part of developing it.
  • Trust building . Collaborative leadership, by its use of an open process and its encouragement of discussion and dialogue, builds trust among those involved in the enterprise.
  • Elimination of turf issues . Similarly, collaborative leadership can help to address turf issues through establishing mutual trust, making sure everyone's concerns are heard, and helping organizations, factions, or individuals find common ground and work together.
Turf issues arise when individuals or organizations feel someone else is invading their "turf," their professional or philosophical or personal territory. In a community, this can mean competition among organizations for prestige, credibility with a target population, or - worst of all - funding, and can result in organizations that should be natural allies working against one another. In an organization, it can mean individuals asserting "ownership" of information, the use of equipment, or administrative procedures, and can cause disastrous splits among staff and ineffective and inefficient operation.
  • Access to more and better information and ideas . When all involved in an issue are party to addressing it, they bring with them a wealth of information, as well as a variety of perspectives. As a result, the solutions they arrive at are likely to be better than those developed in a vacuum, or by only a small number of people.
  • Better opportunity for substantive results . The combination of ownership of the process and its results, trust, real collaboration, and better planning yields real success in the real world. In looking at successful community development efforts, Chrislip and Larson found that nearly all were characterized by collaborative leadership.
  • Generation of new leadership . Collaborative leadership helps to train new leaders from within the group, thus assuring continuity and commitment to the issues the group is addressing.
  • Community or organizational empowerment . The inclusion of all stakeholders - anyone with an interest or involvement in an issue or organization - in problem-solving and decision-making not only prepares potential leaders, but leads to people taking more responsibility and caring more about what they do. It leads to better functioning in every sphere.
  • Fundamental change for the better in the ways communities and organizations operate . Collaborative leadership breeds more collaborative leadership and more collaboration, leading to a different way of looking at solving problems. This in turn brings more willingness to find common ground and common cause with others, more willingness to tackle new issues, and more effective and wide-reaching solutions.

For all its advantages, there are disadvantages that go with collaborative leadership as well. It can be frustrating, and there's no guarantee that it will work with a particular group.

The major difficulties with collaborative leadership include:

  • It's time-consuming . Collaboration takes time, and decision-making that involves a large number of people and organizations may seem to proceed glacially - very slowly, and with a great deal of friction.
  • It demands the ability to face conflict directly and mediate it to a resolution acceptable to everyone. Collaborative leadership is not a job for people who like everything calm and who would prefer that no one ever raise her voice.
At the beginning of Chapter 13, Section 3: Styles of Leadership , there is a true story of a high school principal who tried for several years to be a collaborative leader. His overtures were roundly rejected by a majority of the school's faculty, who preferred to do what they had always done, and to know exactly what the rules were. He eventually left the school, having succeeded only marginally in convincing teachers to become more collaborative, and to take more control of their teaching.
  • It can lead to groups taking what seems to you to be the wrong path . As a collaborative leader, you have to be able to let go of your own ideas and biases, and maintain a process that will guide the group to its own goals, strategy, and action plans.
Whether or not these last two possibilities actually play out depends on the situation. In an organization, the opinions and status of a collaborative leader might still carry more weight than those of other staff members, regardless of how hard he tries to eliminate any hierarchy. In a coalition or community-wide collaboration, even though there may be more and more varied participants, it may be easier for the leader to be seen as a peer.

When is collaborative leadership appropriate?

Collaborative leadership is not always the best solution for a particular group. In the military, for instance, particularly in a combat situation, collaborative leadership would be fatal: while the group carefully worked out its plans, it would be overrun. There are numerous other situations - often related to how quickly decisions have to be made and how decisively people have to act - where collaborative leadership wouldn't work well. Time is clearly a factor, as is the ability of a group to gather and digest information, its level of experience and judgment (you wouldn't put pre-schoolers in charge of their own safety, for instance), its freedom to act, etc.

So how do you know when to employ collaborative leadership? Here are some possibilities to consider:

  • When the timing is right. Good timing is often necessary for collaborative leadership to succeed. When circumstances conspire to bring a situation to a crisis point, that can break down barriers and convince otherwise-reluctant stakeholders that they need to collaborate. By the same token, when things are going well, there may be the time, the funding, and the common will to take on a new collaborative effort.
  • When problems are serious and complex, and both affect and require attention from a number of individuals and groups . This is the kind of situation, referred to earlier, when no one is in charge. It's impossible for any one individual or group to solve the problem by tackling it alone. At the same time, the seriousness and complexity of the problem mean that it's in the self-interest of the individuals and groups involved to put turf issues and the like aside, and to collaborate on dealing with it.
  • When there are a number of diverse stakeholders, or stakeholders with varied interests . In order for these stakeholders to work together, collaborative leadership is needed to build trust - both among stakeholders and in the process - and to make sure that everyone's agenda is heard and honestly considered.
  • When other attempts at solutions haven't worked. Individual organizations or officials may have tried to deal with an issue and failed, or a coalition may have faltered because of internal conflict and/or inability to generate effective action.
  • When an issue affects a whole organization or a whole community . If everyone's affected, everyone needs a voice. Collaborative leadership can provide the opportunity for all to be heard and involved.
  • When inclusiveness and empowerment are goals of the process from the beginning . A coalition that has set out, for instance, to broaden political participation throughout the community would do well to operate with collaborative leadership and a collaborative process. Such a structure would give it credibility among those it's trying to reach, and would also provide that target population with the opportunity to develop its own voice, and to increase its ability to participate fully.

Who are collaborative leaders?

While no one walks around with a name tag saying "Hi, I'm a collaborative leader," potential and actual collaborative leaders are everywhere in a community or organization. They may be independent consultants hired for their facilitation skills, or they may emerge from unexpected places - the corner office of a powerful business, for instance, or a three-room apartment in a public housing complex. Regardless of who they are or where they come from, collaborative leaders usually have some characteristics in common.

Collaborative leaders are - or quickly become - trusted and respected by all the groups and individuals they have to deal with.

Depending on the circumstances, this may mean that they're viewed as neutral, unconnected to any of the interests involved in the collaboration, or having no prior history with any group, and therefore unbiased. Or it may mean that they have a solid reputation for fairness and integrity. It almost always means that, while they may stand to gain from the success of the collaboration, they have nothing personal to gain from their leadership position.

Collaborative leaders relate to diverse groups and individuals with respect and ease.

The necessity of approaching everyone with openness and without condescension, and of being trusted by people of diverse backgrounds and experience, makes this quality a great asset for a collaborative leader.

Collaborative leaders have good facilitation skills.

Because they have to deal with whatever comes up in the collaborative process, collaborative leaders have to be skilled at facilitating more than meetings. Facilitation skills include:

  • A tolerance for and understanding of how to use conflict.
  • The ability to involve everyone and make sure all voices are heard.
  • The capacity to restate arguments, ideas, or issues so that everyone's clear on them. This includes the gift for reframing debate to disarm or enlist as allies many who might otherwise be opponents.
  • An understanding of group process.
The words "group process" often conjure up graduate school courses and psycho-social models of how a group works. Some people may not have this educational or professional background, but have an intrinsic understanding of what's happening in a group, and of how to intervene to address whatever needs to be addressed. If that's the case, groups quickly learn to trust their judgment.
  • An ability to see the big picture. A good facilitator can both view the process that the group is going through, and consider and act on it in light of what's needed to realize the group's goals.

Collaborative leaders are catalysts.

They bring the right people together at the right time to make things happen, and continue to sustain the process that will lead the collaborative to success.

Collaborative leaders nurture new leadership within the collaboration and the community.

Rather than trying to protect their leadership positions, good collaborative leaders encourage potential leaders. They provide opportunities for them to hone their leadership skills, and afford mentoring and support. Collaborative leaders know that new leadership is the life breath of a collaboration.

Collaborative leaders have a commitment to the collaborative process and to finding real solutions to problems.

Good collaborative leaders have to believe in the process, and to champion and maintain it, often in the face of strong opposition. At the same time, they have to keep everyone moving toward the group's goals, even when it feels like nothing's happening.

Collaborative leaders keep the focus on what's best for the group, organization, or community as a whole.

Just as the leader has to be willing to let go of his ego or specific concerns, he tries to help group members learn to do the same, and to focus on solutions that address the broadest, rather than the narrowest, interests.

How do you practice collaborative leadership?

There are a number of elements that need to be mentioned in any discussion of the practice of collaborative leadership: leadership of the process; understanding of the context of leadership in a particular situation; the role of motivator; flexibility and persistence; and the importance, already referred to more than once, of the leader's willingness to put aside her own ego. We'll look at each of these elements in turn.

Lead the process, not the people.

As a collaborative leader, your most important task is not to make sure that the group comes up with the "right" ideas or plans, or to produce single-handedly the vision or goals that it needs to follow. Your main job is to establish, maintain, and safeguard the collaborative process that allows everyone to participate fully in the group's work. In order to fill your role well, there are a number of things you need to do:

Help the group set norms - for meetings, communication, and general operation - that it can live by, and that encourage respect, participation, and trust.

Norms may be stated or unstated, depending on the group and its needs, but in general, the more explicit they are, the better. They can range from, say, the formality of Robert's Rules of Order as a structure for meetings, to the arrangement of seating (chairs in a circle - often an unstated norm), to the responsibilities of particular subgroups or individuals, to guidelines for discussion (no interruption until someone's thought is finished, no name-calling, etc.)

Assure that everyone gets heard.

That means not just letting people speak in meetings, but actively soliciting the opinions of those who haven't spoken, and recording and reviewing with the group everyone's concerns and ideas as you discuss possibilities. Between meetings, it means communicating any news and developments to people on a regular basis and giving them a chance to respond, and making sure they communicate with one another.

Encourage and model inclusiveness.

As a collaborative leader, you have an obligation to invite participation from all segments of the community or organization, to welcome new participants and make sure they meet others (and to encourage other members to do the same), to include them in discussion and subgroups, to help them gain whatever skills they need to participate fully, etc. Perhaps most important, you should be instrumental in creating an atmosphere where all these things happen automatically, without your intervention.

Help people make real connections with one another.

In order to develop trust, especially in those who might formerly have been seen as competitors or enemies, people need time to get to know one another. It's up to the collaborative leader (as well as others) to make sure they get it, in an atmosphere that's safe and open. The leader must exhibit trust as well as encourage it.

As is probably obvious here, the collaborative leader must set an example by practicing what she preaches. To a large extent, the group will become what the leader models, and therefore, she must model what she wants the group to become. Modeling all the functions on this list will help a leader to institutionalize the collaborative process.

Mediate conflicts and disputes.

In any group, conflict is almost inevitable. Trying to ignore it and hoping it will go away is probably the absolute worst way to handle it. In collaborative groups, especially, it needs to be faced head-on and not only resolved, but used constructively, to build trust and further the work of the group. Creative dispute resolution is a vital function of collaborative leadership.

Help the group create and use mechanisms for soliciting ideas.

Suggesting and teaching, if necessary, such techniques as brainstorming; introducing research or other relevant ideas from outside the group; gaining the help of knowledgeable non-members (university faculty or graduate students, for instance) - these are some of the ways that a collaborative leader can assist the group to examine complex issues and come up with potential solutions.

Maintain collaborative problem-solving and decision-making.

The leader must guard against an individual, organization, or small group running away with the process. In many circumstances, it's not only reasonable but necessary to ask a small group to come up with suggestions or plans. But the larger group should instruct them to do so in the first place, and their results should come back to the larger group for discussion and approval.

Push the group toward effectiveness by:

  • Urging it to come to decisions after there's been enough discussion.
  • Helping it to devise appropriate action plans.
  • Making sure that people take and honor responsibility for implementing action plans in a timely and competent way.
  • Holding people accountable to their implementation (and other) responsibilities.
  • Reminding the group to evaluate, adjust, and reevaluate both plans and their implementation, based on results.

Help the group choose initial projects that are doable, in order to build confidence and demonstrate collaborative success.

It's important that the collaborative leader do all she can to encourage the group to take on tasks that can be accomplished with the available time and resources. Initial success will both motivate the group and give it legitimacy.

Help the group identify and obtain the necessary resources to do the work.

Insist on and protect an open process, one that has no expected outcome when it starts, no predetermined decisions demanding only the group's rubber stamp. The process should belong to the group from the very beginning.

Keep the group focused on what's best for the organization, collaborative, or community as a whole, rather than on individual interests.

Know the leadership context

The context of leadership - all the elements that affect what a leader may have to face and what will be required of him - is unique to each situation. As a collaborative leader, you need to understand your particular situation fully, so that you're not caught by surprise by a development that you could have anticipated.

The community. Important factors here are:

  • The current circumstances. What are the issues that the coalition or organization is responding to, and why are they issues?
  • History. What brought the community to this point? What is its history of trying to deal with the current issues? Are there roadblocks that might be thrown up as a result of what happened in the past?
  • The stakeholders and other interested parties. What are their relationships to the issues? Perhaps more important, what are their relationships to one another? How might those relationships help or hinder the effort?
  • Community attitudes. Are there things you need to know about how most people in the community view particular issues, or about what they'll respond to and what they won't accept?

The nature of the problem. The nature of the problem can be considered in two ways. The first is problem type . Chrislip and Larson, following Ronald Heifetz and Riley Sinder, put problems in three categories:

  • Type I is an obvious, clearly-defined problem with an equally obvious, clearly-defined solution that can be exercised by an expert. (The remedy for a broken window is to replace the glass, which can be done by anyone who knows how to glaze windows.)
  • Type II is a clearly-defined problem, but one whose solution requires both an expert and effort on the part of those affected as well. (If your windows are always broken because you keep hitting baseballs into them during backyard games, they not only need glazing, but you need to take your games farther away from your windows.)
  • Type III problems have neither a clear definition nor a clear solution. (All the windows in the neighborhood are continually broken, and no one knows why.)

Barriers to collaboration. Collaborative leaders are often confronted with situations or factors that work against collaboration. It's important to anticipate the most common of these, and to be aware of some ways to eliminate them.

  • If people don't know how to work together , teach them. A community development effort in Newark, NJ, brought in a consulting firm to facilitate group building and to teach collaborative problem-solving and other techniques.
  • If there are turf issues , emphasize the benefits to everyone of collaboration. Show people that they're better off collaborating, and the chances are that they will.
  • If there's unfortunate community history , either among organizations and individuals, or with the issue itself, mediate disputes; point out the differences between now and then; point out the differences between collaboration and groups working separately; and structure the situation so that groups and individuals can interact and make connections.
  • If professionals or some other elite seem to be dominating the collaboration , work with that group to emphasize the importance of inclusiveness, while modeling it yourself. At the same time, provide support and, if necessary, training for others so that they feel more comfortable participating. Structure face-to-face situations (meetings, workshops, etc.) to equalize input from everyone.
  • If there are poor links to the community , forge new ones. Bring people together through introductions and events. Encourage organizations and groups to reach out with active solicitation of help and advice, publicity, public education, and events.
  • If there is little organizational capacity , find resources to hire a coordinator, or tap the collaboration's internal resources for one. Create, with the collaboration as a whole, structures that address this issue.
As is mentioned in many places in the Tool Box, resist applying for or accepting funding that isn't directly relevant to what the collaboration wants or needs to do, and that isn't consistent with the goals, mission, and philosophy of the group. Selling your principles will cause far more problems than the money will solve.

The group's capacity for change. Organizations, groups, and communities vary greatly in their acceptance of change in general and in their openness to particular kinds of change. It's important to start where the group is, rather than at some point which most members may see as radical or impossible. Knowing how ready a group is to try something new can mean the difference between a highly successful collaboration and a group that breaks up with recriminations and a certainty that collaboration doesn't work.

Motivate, motivate, motivate

Keeping the collaboration or organization enthusiastic and eager to continue its work is a significant part of the collaborative leader's role. Being upbeat, even when things look bleak, keeping the group focused on the future and on the larger picture, and identifying and celebrating even the small successes all act to strengthen commitment and guard against discouragement and burn-out.

At the same time, the leader has to ensure that there continue to be reasons for optimism and successes to celebrate by being realistic. It's also part of her job to act as a reality check, and to keep the group from taking on more than it can accomplish. Success is usually incremental, step by step. In guiding those steps, and making sure that the group doesn't try to run before it can walk, the collaborative leader not only safeguards the group's effectiveness, but provides motivation as well.

Be flexible; be unyielding

Be flexible in:

  • Trying out new ideas, and ideas from unusual or unlikely sources
  • Changing course when the situation demands it
  • Letting go of something that's not working
  • Creating opportunities for more participation

Be unyielding in:

  • Protecting the integrity of the open, collaborative process
  • Inclusiveness
  • Keeping the group on track
  • Advocating for what's in the best interests of the organization or community as a whole

Check your ego at the door

As a good collaborative leader, you have to let go of your own ego, and forget about taking credit or being seen as a hero. The role calls for contributing to problem-solving and decisions, but only as a member of the group. The group has to go through its own process, and you, as leader, have to accept the decision it comes to.

This doesn't mean you can't argue for a different position, or that you can't refuse to participate in something you consider unethical. It's important, and is in fact your duty, to model reason and integrity. But while you shouldn't budge on integrity, your reasoning may be faulty, or may simply fail to convince others. If you make your argument forcefully, and people don't buy it, integrity dictates that you respect the process and go along with what's decided. If you're absolutely certain that the group's plan is suicidal, you can, of course, refuse to participate. But you can't force a collaborative enterprise into a path it's not willing to take.

In addition, you have to encourage ideas from all quarters, and encourage new leadership from within the group. Often, you may step aside while others assume leadership on particular issues. In some situations, it may be best for you to step aside permanently, and cede leadership entirely. The ability to do that may be the true mark of a collaborative leader.

Collaborative leadership is the leadership of a process, rather than of people. It means maintaining a process that allows for the inclusion of all stakeholders involved in an issue or organization or community effort; that depends on collaborative problem-solving and decision-making; and that is open and open-ended, with no foreordained conclusions. It is particularly valuable in situations where "no one is in charge," where the size and complexity of problems make it impossible for any individual or organization alone to effect change.

Collaborative leadership encourages ownership of the collaborative enterprise, builds trust and minimizes turf issues, allows for more and better information, leads to better and more effective solutions, encourages new leadership from within the collaboration, empowers the group or community, and can change the way a whole community operates. It can also take inordinate amounts of time, and requires that leaders deal with conflict and resistance to the collaborative process, bite their tongues as the group moves in directions they don't agree with, and subordinate their egos to the process of the group.

In general, the advantages far outweigh the disadvantages, but not in every situation. The best times for collaborative leadership are when the timing is right; when complex and serious problems arise; when stakeholders are characterized by diversity and/or a variety of interests; when other solutions haven't worked; when an issue affects a whole organization or community; or when empowerment is a goal of the process from the beginning.

While collaborative leaders may come from anywhere, they usually have in common community credibility; the ability to relate comfortably to everyone in the community; good facilitation skills; the ability to be catalysts; a commitment to the collaborative process; and a commitment to the common good, rather than to narrow interests.

To be a good collaborative leader, you have to lead, maintain, and safeguard the collaborative process; understand and use the leadership context (the community and the nature of the problem you're facing); be a motivator with a firm footing in reality; be flexible in your dealings with people and inflexible in your defense of the inclusiveness, openness, and collaborative nature of the process; and leave your ego needs at home. If you can do all that, the chances are good that your collaborative effort will succeed.

Online Resources

Chapter 10: Empowerment in the "Introduction to Community Psychology" addresses the different levels of empowerment, how to contribute to power redistribution, and ways to take action to make changes in communities.

Critical Issues in Leadership from the  North Central Regional Educational Laboratory  (NCREL).

Print Resources

Bryson, J. M., & Crosby, B. (1992). Leadership for the Common Good: Tackling Public Problems in a Shared-Power World. San Francisco: Jossey-Bass Publishers.

Chrislip, D., & Larson, C. E. (1994). Collaborative Leadership: How Citizens and Civic Leaders Can Make a Difference. San Francisco: Jossey-Bass Publishers.

Herman, R. D. (Ed.) (1994). The Jossey-Bass Handbook of Nonprofit Leadership and Management. San Francisco: Jossey-Bass Publishers.

Siefer, H. (2001). Leadership Ensemble: Lessons in Collaborative Management from the World's Only Conductorless Orchestra. New York: Times Books/Henry Holt. A book detailing the collaborative leadership style of the Orpheus Chamber Orchestra, which rehearses and performs without a conductor. Includes Orpheus's eight guidelines for collaborative leadership.


Collaborative Problem Solving: A Step-by-Step Guide for School Leaders

  • Lawrence A. Machi - University of La Verne, USA
  • Brenda T. McEvoy - Independent Writer/Researcher

Engage your school communities in collaboratively solving your biggest problems

Schools are complex places where problems come in all shapes and sizes, and where decisions impact students’ lives. Leading groups in solving these problems sometimes can be a daunting task. Collaborative Problem Solving outlines a process to help veteran and new leaders alike to create thoughtful, organized, and collaborative solutions for the simple to the most difficult problems they face.

Rooted in theory, this comprehensive guide presents a seven-step process that addresses all types of problems. Each chapter outlines the tasks and procedures required to successfully navigate each step, while providing helpful analogies and illustrations, alongside common foibles and fumbles leaders should avoid. Additional features include:

  • An explanation of participatory problem-solving
  • Prerequisites for successful collaboration and rules for collaborative leaders
  • “Task Cue Cards” that offer facilitation lesson plans to approach each step in the process
  • A “Problem Solver’s Toolbox” that covers meeting designs, roles, communication strategies, and more
  • An annotated guide for further reading, providing a wealth of additional information and resources

Practical and relevant, this book is a user-friendly manual for school leaders seeking to employ a problem-solving process that works so that they and their teams can feel confident their efforts will result in a successful resolution.



Collaborative Problem Solving: A Resource Guide for Counselors Addressing Family Issues

Idaho Youth Ranch

Collaborative Problem Solving (CPS) is an evidence-based approach that focuses on understanding and addressing the root causes of challenging behavior in children and adolescents. Developed by Dr. Ross Greene, CPS aims to foster empathy, communication, and collaboration between parents, children, and professionals, ultimately leading to more effective and lasting solutions for family issues. This resource guide provides an overview of the CPS model, outlines the key principles and steps involved, and offers practical tips and strategies for counselors working with families.  

The Collaborative Problem Solving Model 

1. Understanding the CPS Philosophy

CPS is grounded in the belief that children do well if they can. The approach posits that challenging behavior is not due to a lack of motivation, attention-seeking, or manipulation but rather a result of lagging skills and unsolved problems. By understanding and addressing these underlying factors, counselors can help families develop more effective, compassionate, and sustainable solutions.  

2. Key Principles of CPS 

Empathy: The foundation of the CPS model is empathic understanding, which involves recognizing and validating the feelings and perspectives of all family members.  

Collaboration: CPS emphasizes the importance of working together rather than relying on unilateral decision-making or power-based approaches.  

Skill-building: The approach focuses on identifying and addressing lagging skills, such as emotion regulation, problem-solving, and communication, to promote lasting change.  

Implementing the Collaborative Problem Solving Process 

The first step in the CPS process is to identify the specific skills that a child may be struggling with. This can be done through a combination of observation, interviews, and assessments. Some common lagging skills include:  

  • Emotional regulation
  • Flexibility
  • Impulse control
  • Problem-solving
  • Communication

Once lagging skills have been identified, the next step is to determine the specific situations or problems that are causing difficulties for the child and family. Unsolved problems are often characterized by predictability and can be uncovered through discussions with family members and the child.  

3. The Three Steps of Collaborative Problem Solving

The CPS process involves three primary steps, which can be adapted and tailored to the unique needs and circumstances of each family.  

  • Step 1: Empathy

Begin by gathering information and understanding the child’s perspective on the problem. This step involves active listening, validating emotions, and demonstrating genuine curiosity.  

  • Step 2: Define Adult Concerns

Clearly articulate the parent or caregiver’s concerns and needs regarding the situation. This step promotes mutual understanding and acknowledges the importance of addressing both the child’s and the adult’s concerns.  

  • Step 3: Invitation to Collaborate 

Invite the child and parent to brainstorm possible solutions together. Encourage them to consider a range of ideas and evaluate each option based on its feasibility and effectiveness in addressing both the child’s and the adult’s concerns.  

Tips and Strategies for Counselors 

1. Build Rapport and Establish Trust 

Establishing a strong therapeutic alliance with both the child and the parent is essential for the success of CPS. Be patient, empathetic, and transparent in your approach in order to foster trust and cooperation.  

2. Use Reflective Listening and Validation 

Active listening and validation are crucial tools in the CPS process. Reflect back the emotions and concerns of family members to ensure they feel heard and understood.  

3. Encourage Open Communication 

Create a safe and non-judgmental environment that encourages open communication and allows family members to express their thoughts, feelings, and concerns without fear of criticism or rejection.  

4. Be Flexible and Adaptable 

Each family is unique, and the CPS process may need to be adapted to suit their specific needs and circumstances. Be prepared to modify your approach, pacing, and techniques as needed to best support the family.  

5. Provide Support and Guidance  

As a counselor, your role is to facilitate the CPS process and provide guidance and support to the family throughout. Offer suggestions, ask probing questions, and share relevant resources to help family members develop their problem-solving skills.  

6. Monitor Progress and Adjust 

Regularly assess the family’s progress and the effectiveness of the solutions they’ve implemented. Be prepared to revisit and adjust the problem-solving process as needed, based on the family’s evolving needs and circumstances.  

7. Encourage Skill-Building 

As part of the CPS process, help family members develop and practice the skills necessary to address their unsolved problems effectively. This may include offering resources, psychoeducation, or skill-building exercises to support growth in areas such as emotion regulation, communication, and flexibility.  

Collaborative Problem Solving offers a compassionate and effective approach to addressing challenging behaviors and family issues. By understanding the underlying causes of these difficulties and engaging in a collaborative, empathic problem-solving process, counselors can help families develop lasting solutions and strengthen their relationships. By following the principles and steps outlined in this resource guide and adapting your approach to meet the unique needs of each family, you can support families in achieving positive, sustainable change.  




Collaborative Problem Solving: Processing Actions, Time, and Performance

Paul De Boeck

1 Department of Psychology, The Ohio State University, Columbus, OH, United States

2 Department of Psychology, KU Leuven, Leuven, Belgium

Kathleen Scalise

3 Department of Educational Methodology, Policy, and Leadership, University of Oregon, Eugene, OR, United States

Abstract

This study is based on one collaborative problem solving task from an international assessment: the Xandar task. It was developed and delivered by the Organization for Economic Co-operation and Development Program for International Student Assessment (OECD PISA) 2015. We have investigated the relationship of problem solving performance with invested time and number of actions in collaborative episodes for the four parts of the Xandar task. The parts require the respondent to collaboratively plan a process for problem solving, implement the process, reach a solution, and evaluate the solution. (For a full description, see the Materials and Methods section, “Parts of the Xandar Task.”) Examples of an action include posting to a chat log, accessing a shared resource, or conducting a search on a map tool. Actions taken in each part of the task were identified by PISA and recorded in the data set numerically. A confirmatory factor analysis (CFA) model looks at two types of relationship: at the level of latent variables (the factors) and at extra dependencies, which here are direct effects and correlated residuals (independent of the factors). The model, which is well-fitting, has three latent variables: actions (A), times (T), and level of performance (P). Evidence for the uni-dimensionality of performance level is also found in a separate analysis of the binary items. On the whole, for the entire task, participants with more activities are less successful and faster, based on the United States data set employed in the analysis. By contrast, successful participants take more time. By task part, the model also investigates relationships between activities, time, and performance level within the parts. This was done because one can expect dependencies within parts of such a complex task. Results indicate some general and some specific relationships within the parts; see the full manuscript for more detail. We conclude with a discussion of what the investigated relationships may reveal. We also describe why such investigations may be important to consider when preparing students for improved skills in collaborative problem solving, considered a key aspect of successful 21st century skills in the workplace and in everyday life in many countries.

Introduction

The construct explored here, collaborative problem solving (CPS), was first introduced to the Program for International Student Assessment (PISA) in 2015. Attempts to explore process data collected in complex activities such as CPS are emerging rapidly in education. Yet which models might best fit process data and the analytic techniques to employ to investigate patterns in the data are not well understood at this time. So here we investigate whether relationships seen in the actions taken by PISA respondents, as coded by PISA, might shed light on approaches for modeling complex CPS tasks.

In the CPS task released by PISA, the Xandar task, there are four parts. The parts of the task require the respondent to collaborate to plan a process for problem solving, implement the process, reach a solution, and evaluate the solution. (For a full description of these parts, see the Materials and Methods section, “Parts of the Xandar Task.”) Examples of actions in Part 1, for instance, include posting to a chat log, accessing a shared resource, or conducting a search on a shared map tool.
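
To make the shape of these process data concrete, the minimal sketch below shows one way raw log events could be aggregated into per-part measures (a count of actions and total time per respondent and part). The event layout, column names, and values are hypothetical illustrations; they are not the actual PISA 2015 log-file format.

```python
# Hypothetical sketch: turning raw log events into per-part process measures.
# Column names and values are illustrative only, not the PISA log format.
import pandas as pd

# Each row is one logged event: a respondent acting within one part of the task.
events = pd.DataFrame(
    {
        "respondent": ["S1", "S1", "S1", "S2", "S2", "S2", "S2"],
        "part":       [1, 1, 2, 1, 1, 2, 2],
        "event":      ["chat_post", "open_resource", "map_search",
                       "chat_post", "chat_post", "map_search", "chat_post"],
        "seconds":    [12.0, 30.5, 44.0, 8.0, 15.5, 60.0, 20.0],
    }
)

# One row per respondent and part: number of actions and total time spent.
per_part = (
    events.groupby(["respondent", "part"])
    .agg(n_actions=("event", "size"), time=("seconds", "sum"))
    .reset_index()
)

# Wide layout for modeling: one column per part-by-measure combination,
# e.g., n_actions_1, time_1, ..., n_actions_4, time_4.
wide = per_part.pivot(index="respondent", columns="part", values=["n_actions", "time"])
wide.columns = [f"{measure}_{part}" for measure, part in wide.columns]
print(wide)
```

A wide table like this (one row per respondent, one column per part-by-measure combination), together with per-part performance scores, is the kind of layout the factor models discussed below can be fit to.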

In each of the parts, process data are available on time spent and number of actions, as well as on the performance on specific items within the four parts. We explore modeling these Xandar data to address three research questions:

  • RQ1. Does a factor model employing process data (actions and time) support evidence for a latent variable differentiation between the types of process data (actions, time) and between the latter two and quality of performance? The expected latent variables are Actions, Time, and Performance.
  • RQ2. Do extra dependencies at the level of the observed variables improve model fit, including direct effects and correlated residuals (independent of the factors)? If they do, they reveal direct relationships between process aspects and performance, independent of the latent variables. These direct relationships are indications of the dynamics underlying collaborative problem solving, whereas the latent variables and their correlations inform us about global individual differences in process approaches and performance.
  • RQ3. Can the performance also be considered as uni-dimensional at the specific level of the individual items (from all four Xandar parts)?

In this Xandar investigation, each factor (latent variable) is composed of four corresponding measures from the four Xandar parts. Data are fit with a latent variable model to answer RQ1. Dependencies within parts can be expected between the three measures. So we address the extra dependencies in RQ2. The dependencies are not only considered for methodological reasons when variables stem from the same part, but they may also reveal how subjects work on the tasks. Finally, because a good-fitting factor model would imply uni-dimensionality of the performance sum scores from the four parts, we also explore uni-dimensionality at the level of the individual items in RQ3.
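
As a rough illustration of the model structure implied by RQ1 and RQ2, the sketch below specifies a three-factor CFA (Actions, Time, Performance), each factor measured by its four part-level indicators, plus one correlated residual and one direct effect as examples of extra within-part dependencies. The semopy package is an assumed tooling choice; the variable names (A1-A4, T1-T4, P1-P4 for the four parts), the file name, and the particular extra dependencies shown are hypothetical placeholders, not the specification reported in the paper.

```python
# Sketch of a three-factor CFA with example within-part dependencies.
# Assumes the semopy package (lavaan-style syntax); variable and file names
# are hypothetical placeholders rather than the paper's actual specification.
import pandas as pd
import semopy

# Measurement model: one factor per type of measure, indicators are the four
# Xandar parts (A = number of actions, T = time, P = performance score).
# The last two lines add example extra dependencies within a part (RQ2):
# a residual covariance between actions and time in Part 1, and a direct
# effect of Part 2 actions on Part 2 performance.
model_desc = """
Actions     =~ A1 + A2 + A3 + A4
Time        =~ T1 + T2 + T3 + T4
Performance =~ P1 + P2 + P3 + P4
A1 ~~ T1
P2 ~ A2
"""

# Wide data set: one row per respondent, columns A1..A4, T1..T4, P1..P4.
data = pd.read_csv("xandar_wide.csv")  # hypothetical file name

model = semopy.Model(model_desc)
model.fit(data)                  # estimates loadings, covariances, regressions
print(model.inspect())           # parameter estimates and standard errors
print(semopy.calc_stats(model))  # fit indices such as CFI, TLI, and RMSEA
```

In this lavaan-style syntax, =~ defines a factor by its indicators, ~~ adds a residual covariance, and ~ adds a regression path; covariances among the Actions, Time, and Performance factors are estimated as part of the model.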

Sections in this paper first discuss the PISA efforts to explore problem solving in 2012 and 2015 assessments, then offer a brief summary of the literature on CPS. Next in the Materials and Methods section, we discuss the PISA 2015 collaborative complex problem solving released task, “Xandar,” including the availability of the released code dictionary and data set. In the Results and Discussion, we model United States data from the Xandar task and report results to address the three research questions.

PISA and a Brief Summary of Literature on CPS

The PISA 2015 CPS construct, which included measuring groups in collaboration, was built on PISA’s 2012 conception of individual problem solving ( OECD, 2014 ). In PISA 2012, some student individual characteristics related to individual problem solving were measured. These measures were openness to learning, perseverance, and problem solving strategies.

For the 2015 PISA collaborative framework ( OECD, 2013 ), the construct of problem solving was extended from 2012 in order to include measures of group collaboration. For this new assessment in 2015, it was recognized that the ability of an individual to be successful in many modern situations involves participating in a group. Collaboration was intended to include such challenges as communicating within the group, managing conflict, organizing a group, and building consensus, as well as managing progress on a successful solution.

The PISA framework described the importance of improving collaboration skills for students ( Rummel and Spada, 2005 ; Vogel et al., 2016 ). The measurement of collaboration skills was at the heart of problem solving competencies in the PISA CPS 2015 framework. The framework specified first that the competency being described remained the capacity of an individual, not of the group. Secondly, the respondent must effectively engage in a process whereby two or more agents attempt to solve a problem, where the agents can be people or simulations. Finally, the collaborators had to show efficacy by sharing the understanding and effort required to come to a solution, such as pooling knowledge to reach solutions.

Approaches to gathering assessment evidence cited by the PISA CPS framework ( OECD, 2013 ) ranged from allowing actions during collaboration to evaluating the results from collaboration. Measures of collaboration in the research literature include solution success, as well as processes during the collaboration ( Avouris et al., 2003 ). In situ observables for such assessments could include analyses of log files in which the computer keeps a record of student activities, sets of intermediate results, and paths taken along the way ( Adejumo et al., 2008 ). Group interactions also offer relevant information ( O'Neil et al., 1997 ), including quality and type of communication ( Cooke et al., 2003 ; Foltz and Martin, 2008 ; Graesser et al., 2008 ) and judgments ( McDaniel et al., 2001 ).

The international Assessment and Teaching of 21st Century Skills (ATC21S) project also examined the literature on disposition to collaboration and to problem solving in online environments. ATC21S described how interface design issues and the evaluation of CPS processes interact in the online collaboration setting ( Scalise and Binkley, 2009 ; Binkley et al., 2010 , 2012 ).

In the PISA 2015 CPS assessment, a student's collaborative problem-solving ability is assessed in scenarios where the student must solve a problem. For collaboration, the problem is solved by working with "agents," or computer avatars that simulate collaboration. The CPS framework describes that a problem need not be a subject-matter-specific task; rather, it could also be a partial task in an everyday problem. Examples of subject-matter-specific problem solving include setting up a sustainable fish farm in science, planning the construction of a bridge using engineering and mathematics, or writing a persuasive letter using language arts and literacy. Examples of an "everyday" problem include communicating with others to delegate roles during collaboration for event planning, monitoring to ensure a group remains on task, and evaluating whether collaboration is complete. All these actions can be directed toward the ultimate goal.

In the PISA 2015 perspective, assessment is continuous throughout the unit and can incorporate students' interactions with the digital agents. Each student response on a traditional question follows a stream of actions during which the student has chosen how to interact and collaborate with standardized agents in each particular task situation. Very few of the collaborative actions and tasks are released by PISA, but the number of collaborative actions in each part of the task is released and made available in the PISA data sets. So here we accept that PISA has coded the actions as taking place, and analyze the numeric results provided.

Materials and Methods

Parts of the Xandar Task

Here we analyze numeric data provided for the PISA 2015 Xandar unit ( OECD, 2017a , 2017b ). In the unit Xandar:

“A three-person team consisting of the student test-taker and two computer agents takes part in a contest where [the team] must answer questions about the fictional country of Xandar. The questions [involve] Xandar’s geography, people and economy. This unit involves decision-making and coordination tasks, requires consensus-building collaboration, and has an in-school, private, and non-technology-based context.”

Xandar is a fictional planet appearing in comic books published by Marvel Comics. In the PISA Xandar task, it is treated as a mythical location to be investigated collaboratively. The Xandar task has four parts:

  • Part 1 – Agreeing on a Strategy. This part of the Xandar activity familiarizes the student with how the contest will proceed, the chat interface, and the task space, including buttons that students can click to take actions in particular situations and a scorecard that monitors team progress. In Part 1, the student is assigned to work in a team with digital agents named Alice and Zach. A variety of actions are available, and the respondent and the agents interact to generate a stream of actions. The respondent is expected to follow the rules of engagement provided for the contest and to effectively establish the collaborative and problem-solving strategies that are the goal of Part 1.
  • Part 2 – Reaching a Consensus Regarding Preferences. In this part of the Xandar activity, each group member should take responsibility for the contest questions in one subject area (Xandar's geography, people, or economy). The team members must apportion the subject areas among themselves. The agents begin by disagreeing. The student has opportunities to help resolve the disagreement and can take a variety of actions; the goal is to establish common understanding.
  • Part 3 – Playing the Game Effectively. In this part of the Xandar activity, group members begin playing the game by answering geography contest questions together. The group has the opportunity to choose among answers, during which the agents interject questions, pose concerns, and even violate game rules. The student exhibits collaborative problem solving strategies through actions and responses.
  • Part 4 – Assessing Progress. In this part of the Xandar activity, agent Alice poses a question about the team's progress. The student responds with an evaluation. Regardless of the student's answer, agent Zach indicates he is having trouble finding information for his assigned subject area, economy. Responses and actions take place regarding both evaluating and supporting group members.

Each of the four parts comes with a number of items to score the performance. The complete Xandar released task is presented in an OECD PISA report that illustrates the items that students faced in the 2015 PISA collaborative problem-solving assessment ( OECD, 2016 ). The released code dictionary and data are also available on the 2015 PISA website. We do not repeat the Xandar information here (due in part to copyright), but summarize only. The Xandar released unit presents:

  • a screenshot of each item
  • the correct action(s) or response to the item
  • an explanation as to why the action or response is correct
  • the skills that are examined by the item
  • alignments describing the difficulty of the item.

As described earlier, this study employed data publicly released from the Organization for Economic Co-operation and Development Program for International Student Assessment (OECD PISA) for the optional collaborative problem solving (CPS) assessment. It was administered in 2015 to nationally representative samples of students approximately 15 years of age. Since PISA is designed to have systematically missing data in a matrix sample, only students who took the Xandar task were included. Students were sampled according to the PISA sample frame. Data analyzed here are from representatively sampled United States participants who took the Xandar released task. See Table 1 for descriptives by age, gender, and race/ethnicity of the United States Xandar task sample used.

Table 1. Descriptives for the collaborative problem solving Xandar assessment for the United States sample.

From the 994 students who took the Xandar task, 986 have complete Xandar data. The descriptive statistics and all analyses are based on N = 986. (Note that one limitation, discussed later in this manuscript, is that only United States data have been examined to date in this exploration. Extensions to more countries and comparisons across countries are an exciting and interesting potential extension of the work, but such international extensions are out of scope for this article.) For the purposes of the current study, the school variable was not employed; all students were treated as one group.

Regarding ethical approval and consent for human subjects data collection in PISA, OECD gains ethical approval and consent through PISA processes. Processes are established in coordination with each country for public release of some de-identified data collected in PISA main study assessments. Data sets made available for release are intended for purposes of secondary research. The CPS data set used here is available through the OECD data repository website 1 .

As discussed earlier, for the Xandar task, released data are available for actions, time, and level of performance. The data for the current study included four indicators each of CPS actions taken (Parts 1–4), time taken (Parts 1–4), and success scores (Parts 1–4). These become the three latent traits, or factors, in this study. To measure CPS actions, we used the number of collaboration actions as measured by the log transformation of C1A, C2A, C3A, and C4A, where "C" indicates a collaborative assessment, the numeral indicates the Xandar part, and "A" indicates the number of actions taken. To measure timing, we used the log transformation of C1T, C2T, C3T, and C4T, where "T" indicates time taken. To measure student success, we used the sum of the binary item response success scores for each of the four parts, C1P, C2P, C3P, and C4P (based on 5, 3, 2, and 2 items within the Xandar parts).
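As a rough illustration of this variable construction, the following R sketch builds the twelve observed variables from a hypothetical extract of the released data. The input column names (actions_p1, time_p1, item1_p1, and so on) are placeholders, not the actual PISA codebook names.

```r
# Minimal sketch of the variable construction described above.
# Input column names are placeholders, not the actual PISA field names.
xandar <- read.csv("xandar_usa.csv")  # hypothetical extract of the US Xandar cases

# Log-transform the action counts and timing variables for Parts 1-4
# (assumes strictly positive counts and times)
for (p in 1:4) {
  xandar[[paste0("C", p, "A")]] <- log(xandar[[paste0("actions_p", p)]])
  xandar[[paste0("C", p, "T")]] <- log(xandar[[paste0("time_p", p)]])
}

# Performance sum scores: Parts 1-4 contain 5, 3, 2, and 2 binary items
n_items <- c(5, 3, 2, 2)
for (p in 1:4) {
  item_cols <- paste0("item", seq_len(n_items[p]), "_p", p)
  xandar[[paste0("C", p, "P")]] <- rowSums(xandar[item_cols])
}
```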

Exploratory data analysis, following the log transformation described above for some variables, revealed only minor deviations from normality. A skewness range of −2 to 2 was used as the criterion for all observed variables ( Cohen et al., 2002 ). Note, however, that this is not a strongly conservative range, as discussed in the limitations. We therefore also report that skewness for all observed variables was approximately in the range −1 to 1, except for C1A (1.52) and C2A (1.48). Because there were no major deviations, the analysis proceeded without further transformation of the observed variables. Other descriptives for all observed variables are provided in Table 2 .
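Continuing the sketch above, one simple way to run this screen is to compute the sample skewness (third standardized moment) for each observed variable; note that this is only one of several common skewness formulas.

```r
# Sample skewness: third standardized moment (one common definition among several)
skewness <- function(x) {
  x <- x[!is.na(x)]
  mean((x - mean(x))^3) / sd(x)^3
}

obs_vars <- c(paste0("C", 1:4, "A"), paste0("C", 1:4, "T"), paste0("C", 1:4, "P"))
sapply(xandar[obs_vars], skewness)  # flag values outside roughly -2 to 2 (or -1 to 1)
```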

Table 2. Descriptives for observed variables.

We fit the model using lavaan ( Rosseel, 2012 ) in R version 3.5.1 ( R Core Team, 2018 ). We used the weighted least squares ("WLSMV") option, which employs the diagonally weighted least squares (DWLS) estimator with robust standard errors and a mean- and variance-adjusted test statistic. We estimated a confirmatory factor analysis (CFA) model with three factors (each with standardized latent variables). The factors are Actions, Time, and Performance, and each has the four corresponding measures from the four Xandar parts.

Because dependencies within parts can be expected between the three measures, some parameters were added to the model. They are direct within-part effects of actions on time (more actions implies more time), direct within-part effects of performance on time (better performance may take more time), and correlated residuals for actions and performance within each part (exploring the relationship between actions and performance level).

Direct effects and residual correlations are two different types of dependencies. Direct effects are effects of one variable on another (e.g., of Y1 on Y2); the two directions, Y1 → Y2 and Y2 → Y1, are not mathematically equivalent. Correlated residuals are equivalent to the effect of the residual of one variable on the other variable (e.g., of εY1 on Y2); here the two directions are mathematically equivalent, and equivalent to the covariance of the residuals. To be clear, neither of these dependencies proves a causal relation. A causal hypothesis can be at the basis of hypothesizing a direct effect, whereas correlated residuals can be used for explorative purposes, without specifying a direction. For the present study, we hypothesized that more actions take more time and that a higher level of performance requires more time. For number of actions and level of performance, we explore the dependency with correlated residuals.

See the row heads of Tables 3, 4 and Figure 1 for a definition of the model estimated. It includes the latent variable structure as well as the dependencies. The model can also be derived from the R code for the analysis, which is available in the Supplementary Material .
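The authors' full R code is in the Supplementary Material; the lavaan sketch below is only a minimal reconstruction consistent with the description above (three standardized factors, within-part effects of actions and performance on time, and within-part correlated residuals between actions and performance), continuing the hypothetical data frame from the earlier sketches.

```r
library(lavaan)

model <- '
  # Latent variables (RQ1), one indicator per Xandar part
  Actions     =~ C1A + C2A + C3A + C4A
  Time        =~ C1T + C2T + C3T + C4T
  Performance =~ C1P + C2P + C3P + C4P

  # Within-part direct effects (RQ2): actions -> time, performance -> time
  C1T ~ C1A + C1P
  C2T ~ C2A + C2P
  C3T ~ C3A + C3P
  C4T ~ C4A + C4P

  # Within-part correlated residuals between actions and performance
  C1A ~~ C1P
  C2A ~~ C2P
  C3A ~~ C3P
  C4A ~~ C4P
'

fit <- cfa(model, data = xandar, estimator = "WLSMV", std.lv = TRUE)
fitMeasures(fit, c("tli", "rmsea", "rmsea.ci.lower", "rmsea.ci.upper"))
```

Dropping the eight direct effects and four residual correlations from the model string and refitting gives the comparison model without dependencies reported below.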

Table 3. CFA factor loadings for the Xandar measures.

Table 4. Extra dependencies in the CFA model for the Xandar measures.

Figure 1.

Latent variable and dependency model for Xandar data. The latent variables are Time, Actions, and Performance. The observed variables per factor are indicated with capital letters referring to the latent variable (T, A, P) and with a number referring to the Xandar part (1, 2, 3, and 4). The direct effects between observed variables from the same Xandar part are indicated with single-headed dashed arrows (between the A and T and between the P and T). The correlated residuals are indicated with dotted lines without arrows. Significance (p < 0.01) is denoted with a thicker dashed arrow (direct effects) or line (correlated residuals). All dependencies are positive except when indicated with "neg" (between A1 and P1). Correlations between latent variables, factor loadings, residual variances, and dependency values are omitted to avoid clutter in the figure. The correlations between the latent variables can be found in the text, the factor loadings are presented in Table 3 , and the dependency values in Table 4 .

Results and Discussion

In this section we describe the results of the modeling. With the dependencies as described in the Methods section added to the model, the model fit was good (close), with a TLI of 0.95 and RMSEA of 0.038 (90% CI 0.029 to 0.048). Without the dependencies (without the eight direct effects and four residual correlations), the model fit is clearly worse, with a TLI of 0.574 and RMSEA of 0.112 (90% CI 0.104 to 0.119). These results address RQ1 and RQ2.

The correlations between the latent variables are −0.473, p < 0.001 (Actions and Time), −0.732, p < 0.001 (Actions and Performance), and 0.190, p < 0.01 (Time and Performance). The loadings and dependencies are shown in Tables 3 and 4, respectively. As expected, the indicators of actions, time, and performance all showed significant positive factor loadings on the corresponding factors (see Table 3 ). The standardized coefficients in the last column indicate that the loadings of the Part 4 indicators are lower than those of the other three parts: 0.19 (Actions), 0.43 (Time), and 0.38 (Performance).
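If the model has been fit as in the earlier sketch, these quantities can be pulled from the fitted object along the following lines (a continuation of that hypothetical analysis, not the authors' actual script).

```r
lavInspect(fit, what = "cor.lv")  # latent correlations: Actions-Time, Actions-Performance, Time-Performance
standardizedSolution(fit)         # standardized loadings, direct effects, and residual correlations
```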

Table 4 shows the estimates of the dependencies:

  • More activities make time longer: a significant positive effect of actions on time was found for all four parts.
  • A significant positive effect of performance on time was found only for Part 4. For the other parts the effect was almost zero.
  • Number of activities and performance level have significant correlated residuals for two parts. For explorative reasons these dependencies were not tested with a direction but with correlated residuals instead, and the results differed by part: a negative dependency for Part 1, a positive dependency for Part 4, and an almost zero dependency for Parts 2 and 3.

Although the factor model with these dependencies fits well, we wanted to check whether performance is also uni-dimensional at the level of the individual items (RQ3). Uni-dimensionality of the four sum scores, as implied by the factor model, does not imply uni-dimensionality at the level of the 12 individual binary items. This is especially so because the items represent four processes (exploring and understanding, representing and formulating, planning and executing, and monitoring and reflecting) and three competencies (establishing and maintaining shared understanding, taking appropriate action to solve the problem, and establishing and maintaining team organization), but not in a perfectly crossed design.

The answer to the dimensionality question based on the analysis with this data set is that the 12 items can be considered as uni-dimensional based on the empirical data, although they are designed to tap on a diversity of processes and competencies. The uni-dimensional model fit was good (close), with a TLI of 0.94 and RMSEA of 0.037 (90% CI 0.029, 0.046). The uni-dimensional model is the result of an ordinal confirmatory factor model for the binary items using WLSMV and the same lavaan version as for the earlier analysis. For the delta parameterization the loadings vary between 0.272 and 0.776 and they are all significant ( p < 0.001).
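A minimal lavaan sketch of such a uni-dimensional ordinal CFA is shown below, again as an illustration only; the item variable names (it1 through it12) are placeholders rather than the released item codes.

```r
# Uni-dimensional ordinal CFA for the 12 binary Xandar items (RQ3).
# Item names are placeholders for the released item codes.
items     <- paste0("it", 1:12)
uni_model <- paste("CPS =~", paste(items, collapse = " + "))

fit_uni <- cfa(uni_model, data = xandar, ordered = items,
               estimator = "WLSMV", parameterization = "delta", std.lv = TRUE)
fitMeasures(fit_uni, c("tli", "rmsea", "rmsea.ci.lower", "rmsea.ci.upper"))
```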

For the model with loadings and dependencies shown in Tables 3 and 4, the latent variable correlations of Actions with Time and with Performance are negative. Hence, participants showing more activities are faster and perform less well in their collaborative problem solving. This is based on the United States data set with the Xandar task. Successful participants take more time, perhaps a consequence of the previous two relationships. Multiplying the two negative correlations yields −0.473 × −0.732 = 0.346, which is higher than the 0.190 estimate of the correlation between Time and Performance. This explains why, in an alternative but formally equivalent model with an effect of Actions on Time and on Performance, the correlation between the residuals of the latent variables Time and Performance is negative. However, the correlation of −0.260 in question is not significant ( p > 0.05).

The negative correlation between Actions and Time suggests that highly active students are fast and less active students are slow. The combination of fast and active on the latent variables seems to reflect an impulsive, fast, trial-and-error style, which shows itself in the Xandar task as not very successful, versus a slower, more thoughtful, and apparently more successful style. It makes sense that respondents who are more deliberative may have more knowledge to bring to considering a successful solution, or may be exhibiting more test effort in the Xandar context. We do not have the information to examine what is happening during the deliberation. This is in part because descriptions of the possible actions are not available in the data set. In addition, PISA provides no interpretive information for the sample. Such information could include think-alouds where students describe why they are doing what they are doing, qualitative response process information in which students explain their processes, in-depth interviews, or other approaches that supply interpretive information.

Of course, it also makes sense that more actions take more time, which shows in the analysis of the dependencies between observed actions and time. This illustrates why it is informative to differentiate relationships between latent variables from relationships which show up in the dependencies.

Other important dependencies concern Part 4, which is clearly a reflective task, a kind of evaluative pause. The nature of the task may explain why performance is associated with more actions and requires more time, in contrast with Part 1 (agreeing on a strategy), where the association between actions and performance is negative. For instance, too much discussion on a strategy may signal a lack of structure.

The finding that the items examined can be considered uni-dimensional, although they are designed to tap a diversity of processes and competencies, suggests that the collaborative ability generalizes across processes. In other words, the collaborative competencies rely on a general underlying ability. The specificities of the processes are reflected in the extra dependencies. Part 4 involves monitoring and reflecting, which may explain why more activities and more time are associated with better performance. Part 1, by contrast, involves planning and executing as well as representing and formulating, which may lead to better results if based not on trial and error (many actions) but on a structured and goal-oriented approach (fewer actions).

These dependencies suggest that, depending on the task, the collaborative ability may rely on a general underlying ability but be implemented through a different approach in various collaborative actions, as has been discussed in the literature ( Fiore et al., 2017 ; OECD, 2017b ; Eichmann et al., 2019 ). The special and specific status of Part 4 is also reflected in its lower loadings on all three latent variables (see standardized loadings).

Note that the extra dependencies here are not only considered for methodological reasons when variables stem from the same part. They may also reveal how subjects work on the tasks, which is consistent with the findings here. Parts such as 1 and 4 have distinct theoretical descriptions in the PISA framework, and the empirical process data suggest that they draw on the collaborative ability through different approaches.

Taken together, these results for the United States data set are consistent with modeling problem solving performance in relation to invested time and number of actions.

The potential impacts underscore that it seems possible both to collect and to scale information on collaborative ability. Such measures may help provide intervention support since, in today's world especially, teams with good collaborative skills are necessary in any group. Groups can range from families to corporations, public institutions, organizations, and government agencies ( OECD, 2013 ). Previously, dispositions to collaborate were reported based on the PISA data ( Scalise et al., 2016 ). Indicators of collaborative ability also may be needed to create adequate interventions to train collaboration skills and to change current levels of individual collaboration.

As previously reported, the disposition dimensions of collaborate, negotiate, and advocate/guide might be useful starting points for creating such interventions ( Scalise et al., 2016 ; OECD, 2017a ). Alternatively, the factor structure here may yield suggestions on additional interesting starting points. This could include structures by which a student may approach collaboration ( OECD, 2017b ; Wilson et al., 2017 ), but more interpretive information would be needed. This could be combined with how participatory a student is disposed to be in collaboration, along with his or her team leadership inclinations, and beliefs in the value or efficacy of collaboration ( Scalise et al., 2016 ).

Limitations to the analysis here include that only the United States data set of many countries available in the PISA data was analyzed. So this analysis should be extended to more countries and results compared in future work.

Also, from a statistical standpoint as discussed earlier, missing data were excluded listwise. In addition, minor but not major skewness was seen in two of the observed variables. Finally, multilevel modeling was not employed so the nested nature of students within schools was not taken into account.

TLI and RMSEA were reported here as the two fit indices since they seem most commonly used in the educational assessment field for large-scale analyses, although consideration of this topic for CPS has so far been limited.

For limitations from a conceptual standpoint, OECD releases a limited range of information; for instance, items for only one of the 2015 collaborative problem solving tasks (Xandar) were released, and collaborative actions were numbered but not described in the data set and data dictionary.

There are several implications of this study for future work. First, the era of analyzing process data, and not only item response data, in robust assessment tasks is upon us (e.g., Praveen and Chandra, 2017 ). Approaches such as the one used here could be applied to other constructs, not just problem solving. Models can explore two types of relationship:

  • relationships at the level of general individual differences (the factors)
  • extra dependencies, which are direct effects and correlated residuals (independent of the factors)

These extra dependencies may provide a window on the underlying process dynamics (see Figure 1 ). It should be noted, as an implication for future work, that it would be helpful if a range of simplified visualizations could be developed for such complex analyses. Standard plots after including dependencies seemed too complex to be fully useful.

For extensions to the specific modeling here, it would be important as discussed earlier to explore fitting the same or similar models across data sets from other countries ( Thomas and Inkson, 2017 ). This could be augmented by also modeling potential country-level effects at the item level, by exploring differential item functioning. Furthermore it would be interesting to consider covariates available in the PISA student questionnaire data set (SQ) in relation to the collaborative ability examined here. This could include indicators for dispositions for collaborative problem solving that moved forward to the main PISA study ( Scalise et al., 2016 ). These indicators include student-level indicators available in the CPS SQ data set regarding self-report of dispositions toward cooperation, guiding, and negotiating.

It should also be mentioned that other very interesting student-level indicators regarding additional preferences in collaboration had to be dropped from the PISA main study due to time limitations for the PISA administration. Dropped indicators included dispositions toward collaborative leadership, as well as student-level indicators of in-school and out-of-school collaborative opportunities. While these could not be included in the main study, the indicators were part of the field testing, and they could be very interesting to administer at the country level in other national or international assessments.

Teacher-level indicators are also available in the PISA data set that provide information on opportunity to learn (OtL) for students in the PISA CPS. Data include classroom-level OtL reports of team activities, grouping practices, types of collaborative activities, and types of rewards provided for engaging in successful team work. Exploring relationships here might allow more reflection on connections to potential interventions. The PISA data are cross-sectional but might help to inform research studies within countries.

In closing, it is important to mention that the creation and delivery of the innovative PISA CPS instrument included both simulated collaboration for a hard-to-measure construct ( Scalise, 2012 ) and the sharing of some process data. This was critical to the examination here, as has been the case for other collaboration-oriented assessments ( Greiff et al., 2014 , 2015 , 2016 ). The analysis underscores that addressing the challenges of education in the 21st century may continue to require new data sources worldwide.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1 www.oecd.org/pisa/data/

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01280/full#supplementary-material

  • Adejumo G., Duimering R. P., Zhong Z. (2008). A balance theory approach to group problem solving. Soc. Netw. 30, 83–99. doi: 10.1016/j.socnet.2007.09.001
  • Avouris N., Dimitracopoulou A., Komis V. (2003). On evaluation of collaborative problem solving: methodological issues of interaction analysis. J. Comput. Hum. Behav. 19, 147–167.
  • Binkley M., Erstad O., Herman J., Raizen S., Ripley M., Miller-Ricci M., et al. (2012). "Defining twenty-first century skills," in Assessment and Teaching of 21st Century Skills, eds Griffin P., McGaw B., Care E. (New York, NY: Springer).
  • Binkley M., Erstad O., Herman J., Raizen S., Ripley M., Rumble M. (2010). "Assessment and teaching of 21st century skills: defining 21st century skills," in White Paper released at the Learning and Technology World Forum 2010 (London).
  • Cohen J., Cohen P., West S. G., Aiken L. S. (2002). Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 3rd Edn. Hove: Psychology Press.
  • Cooke N. J., Kiekel P. A., Salas E., Stout R., Bowers C., Cannon-Bowers J. (2003). Measuring team knowledge: a window to the cognitive underpinnings of team performance. Group Dyn. 7, 179–219.
  • Eichmann B., Goldhammer F., Greiff S., Pucite L., Naumann J. (2019). The role of planning in complex problem solving. Comput. Educ. 128, 1–12. doi: 10.1016/j.compedu.2018.08.004
  • Fiore S. M., Graesser A., Greiff S., Griffin P., Gong B., Kyllonen P., et al. (2017). Collaborative Problem Solving: Considerations for the National Assessment of Educational Progress. Available at: https://nces.ed.gov/nationsreportcard/pdf/researchcenter/collaborative_problem_solving.pdf (accessed April 11, 2018).
  • Foltz P. W., Martin M. J. (2008). "Automated communication analysis of teams," in Team Effectiveness in Complex Organisations and Systems: Cross-Disciplinary Perspectives and Approaches, eds Salas E., Goodwin G. F., Burke S. (Boca Raton, FL: CRC Press).
  • Graesser A. C., Jeon M., Dufty D. (2008). Agent technologies designed to facilitate interactive knowledge construction. Discourse Process. 45, 298–322. doi: 10.1080/01638530802145395
  • Greiff S., Krkovic K., Hautamäki J. (2016). The prediction of problem solving assessed via microworlds: the relative importance of fluid reasoning and working memory. Eur. J. Psychol. Assess. 32, 298–306. doi: 10.1027/1015-5759/a000263
  • Greiff S., Wüstenberg S., Avvisati F. (2015). Computer-generated log-file analyses as a window into students' minds? A showcase study based on the PISA 2012 assessment of problem solving. Comput. Educ. 91, 92–105. doi: 10.1016/j.compedu.2015.10.018
  • Greiff S., Wüstenberg S., Csapó B., Demetriou A., Hautamäki J., Graesser A., et al. (2014). Domain-general problem solving skills and education in the 21st century. Educ. Res. Rev. 13, 74–83. doi: 10.1016/j.edurev.2014.10.002
  • McDaniel M. A., Morgeson F. P., Finnegan E. B., Campion M. A., Braverman E. P. (2001). Use of situational judgment tests to predict job performance: a clarification of the literature. J. Appl. Psychol. 86, 730–740. doi: 10.1037/0021-9010.86.4.730
  • OECD (2013). PISA 2015: Draft Collaborative Problem Solving Framework. Available at: http://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf (accessed September 26, 2014).
  • OECD (2014). PISA 2012 Results: Creative Problem Solving: Students' Skills in Tackling Real-Life Problems (Volume V). Paris: OECD.
  • OECD (2016). Description of the Released Unit from the 2015 PISA Collaborative Problem-Solving Assessment, Collaborative Problem-Solving Skills, and Proficiency Levels. Available at: https://www.oecd.org/pisa/test/CPS-Xandar-scoring-guide.pdf (accessed December 10, 2018).
  • OECD (2017a). Chapter 17: Questionnaire Design and Computer-Based Questionnaire Platform. In PISA 2015 Technical Report. Available at: http://www.oecd.org/pisa/data/2015-technical-report/ (accessed September 21, 2017).
  • OECD (2017b). PISA 2015 Results (Volume V): Collaborative Problem Solving, PISA. Paris: OECD Publishing.
  • O'Neil H. F., Chung G., Brown R. (1997). "Use of networked simulations as a context to measure team competencies," in Workforce Readiness: Competencies and Assessment, ed. O'Neil H. F. (Mahwah, NJ: Lawrence Erlbaum Associates), 411–452.
  • Praveen S., Chandra U. (2017). Influence of structured, semi-structured, unstructured data on various data models. Int. J. Sci. Eng. Res. 8, 67–69.
  • R Core Team (2018). R: A Language and Environment for Statistical Computing. Vienna: R Foundation for Statistical Computing.
  • Rosseel Y. (2012). lavaan: an R package for structural equation modeling. J. Stat. Softw. 48, 1–36. doi: 10.18637/jss.v048.i02
  • Rummel N., Spada H. (2005). Learning to collaborate: an instructional approach to promoting collaborative problem solving in computer-mediated settings. J. Learn. Sci. 14, 201–241. doi: 10.1207/s15327809jls1402_2
  • Scalise K. (2012). "Using technology to assess hard-to-measure constructs in the CCSS and to expand accessibility," in Proceedings of the Invitational Research Symposium on Technology Enhanced Assessments (Princeton, NJ).
  • Scalise K., Binkley M. (2009). "Transforming educational practice by transforming assessment: update on assessment & teaching of 21st century skills," in PISA Problem Solving 2012 (Santa Barbara, CA).
  • Scalise K., Mustafić M., Greiff S. (2016). "Dispositions for collaborative problem solving," in Assessing Context of Learning World-Wide (Methodology of Educational Measurement and Assessment Series), eds Kuger S., Klieme E., Jude N., Kaplan D. (Dordrecht: Springer).
  • Thomas D. C., Inkson K. (2017). "Communicating and negotiating across cultures," in Cultural Intelligence: Surviving and Thriving in the Global Village, 3rd Edn (Oakland, CA: Berrett-Koehler), 76–97.
  • Vogel F., Wecker C., Kollar I., Fischer F. (2016). Socio-cognitive scaffolding with computer-supported collaboration scripts: a meta-analysis. Educ. Psychol. Rev. 29, 477–511. doi: 10.1007/s10648-016-9361-7
  • Wilson M., Gochyyev P., Scalise K. (2017). Modeling data from collaborative assessments: learning in digital interactive social networks. J. Educ. Meas. 54, 85–102. doi: 10.1111/jedm.12134
Research article (open access), published 04 October 2022

Understand group interaction and cognitive state in online collaborative problem solving: leveraging brain-to-brain synchrony data

Xu Du, Lizhao Zhang, Jui-Long Hung, Hengtao Tang, and Yiqian Xie

International Journal of Educational Technology in Higher Education, volume 19, Article number: 52 (2022)


This study analyzed the process of online collaborative problem solving (CPS) via brain-to-brain synchrony (BS) at the problem-understanding and problem-solving stages. BS refers to the synchronization of brain activity between two or more people and serves as an indicator of interpersonal interaction or common attention, offering insights beyond traditional approaches (surveys and observation). Thirty-six undergraduate students participated. Results indicate that the problem-understanding stage showed a higher level of BS than the problem-solving stage. Moreover, the level of BS at the problem-solving stage was significantly correlated with task performance. Groups in which all students had high CPS skills showed the highest level of BS, while some of the mixed groups achieved the same level of BS. BS is an effective indicator of group performance and individual interaction in CPS. Implications for online CPS design and possible supports for the process of online CPS activity are also discussed.

Introduction

Collaborative problem solving (CPS, hereafter) has become a prominent feature of 21st-century learning skills, and it is being researched across many domains (Care et al., 2012 ). CPS involves two or more people working together to solve a problem. Such capabilities have been recognized as a crucial goal in education (OECD, 2017 ). Research has indicated that the CPS skill of team members affects the effectiveness of collaboration (Andrews & Rapp, 2015 ). Groups with at least one student with high CPS skills showed significantly better learning performance (Andrews-Todd & Forsyth, 2020 ). Therefore, intensive efforts have been made to develop related assessments and to activate education reforms to improve the effectiveness of CPS (Stadler et al., 2020 ). In addition, education practitioners have particularly emphasized the need for building skills for remote collaborations (OECD, 2017 ; Schulze & Krumm, 2017 ), as teams have become distributed and as schooling or working from home has become the norm. Therefore, how to design, develop, and implement online CPS activities to improve their effectiveness is one of the more important topics in current CPS research.

To better organize a CPS activity, it is necessary to understand students' collaboration processes in CPS learning activities. Instructors can then design and develop more effective CPS activities and provide personalized support as needed. The first step is to define the different phases of the whole CPS process (Hayes, 2013 ) for advanced analyses and observations. The CPS process can be defined as having two phases: the problem-understanding phase and the solution development phase (Kwon et al., 2019 ). The problem-understanding phase involves a cognitive structure that corresponds to a problem constructed by a solver (Chi, Feltovich, et al., 1981 ). Then, in the solution development phase, students work together to develop corresponding solutions based on the collaborative cognitive structure. Therefore, group dynamics (i.e., how students interact with each other) is a critical element during the process (Chi, Glaser, et al., 1981 ). The two phases form a circular cycle and jointly influence the quality of a solution to the problem. Studies have been conducted to understand how each phase influences learning outcomes (Chang et al., 2017 ; Kwon et al., 2019 ; Zheng et al., 2020 ). However, most of the studies obtained their data through questionnaires or observations. There is a need for additional in-depth analytic results, not derived solely from perceptual data, to understand the details of the individual CPS phases, especially from the aspect of the dynamics of the group members.

The development of emerging technologies opens new possibilities to collect and analyze students’ behaviors and interactions without interfering in the learning process (Chanel & Muhl, 2015 ). Physiological data, such as electrodermal activity (EDA, hereafter), heart rate, gesture, body pose, and electroencephalogram (EEG, hereafter), reflect personal physical and psychological states (Cukurova et al., 2020 ; Sharma & Giannakos, 2020 ). Such data have been adapted to make up for some gaps in perceptional data analysis (Ashwin & Guddeti, 2020 ; Dikker et al., 2017 ; Noroozi et al., 2020 ). Physiological synchrony (PS, hereafter) is one of the analytic approaches used to obtain insights from physiological data. Studies for years in psychophysiology indicated that human cognition cannot be separated from the body (Critchley et al., 2013 ). The connection is bidirectional between a person’s mental states and his/her physiological signals (Critchley & Garfinkel, 2018 ; Pecchinenda, 1996 ). PS is related to learners’ beliefs about their cognitions, motivations, emotions, and behaviors (Haataja et al., 2018 ). The level of PS has been adopted to measure whether the interaction is effective (Dindar, Malmberg, et al., 2020 ; Sobocinski et al., 2021 ). As CPS is rooted in the social constructivist view of learning, which assumes that in-depth learning occurs when students engage in building a shared understanding of a problem through social interactions (Jermann & Dillenbourg, 2008 ; Pear & Crone-Todd, 2002 ), PS can serve as an indicator of effective interaction in the process of collaborative problem-solving (Dindar, Järvelä, et al., 2020 ; Sobocinski et al., 2021 ).

Among the PS instruments mentioned above, brain wave synchrony (or so-called brain-to-brain synchrony, BS) has its advantages in observing CPS processes. Compared with other PS signals, such as EDA and heart rate, BS can reflect students' cognitive state more accurately (Stuldreher et al., 2020a , 2020b ). Since collaborative learning involves a high level of interactivity (Davidesco, 2020 ), BS serves as a more in-depth analytical indicator that reflects interpersonal interaction or common attention from a cognitive state (Nam et al., 2020 ). Studies have been published that reveal the positive correlation between BS and the level of engagement (Bevilacqua et al., 2019 ; Dikker et al., 2017 ) and between BS and academic performance (Davidesco et al., 2019 ). However, these studies have been limited to individual students; they have not focused on collaborative learning. In addition, these studies analyzed BS through original EEG signals. Since EEG signals can be divided into specific ranges through frequency (Alarcao & Fonseca, 2019 ), analyzing the BS from different bands is a helpful method to further discover BS characteristics in collaborative learning.

Based on the literature, different CPS phases play different roles in the process of collaboration. In addition, developing a good solution relies heavily on individual students' domain knowledge and CPS skills. BS is an effective indicator of interaction quality; it has been shown to correlate positively with learning performance in lecture settings but has rarely been studied in CPS. Therefore, the purpose of this study is to analyze the characteristics of BS in CPS in order to find a new effective indicator, to understand the collaboration process at different CPS phases, and to examine how CPS skills impact the collaboration process from the aspect of BS. The research questions include:

What are the characteristics of BS in online CPS?

What are the differences in BS between collaborative groups with different CPS skills?

What are the differences in group interaction and cognition between groups with different BS levels?

Related works

Collaborative problem solving (CPS)

Collaborative problem solving is defined as " the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills, and efforts to reach that solution " (OECD, 2017 , p. 26). The OECD (2017 ) developed the PISA survey, which aims to investigate whether a student has acquired the key knowledge and skills for full participation in modern societies near the end of his or her compulsory education. Overall, twelve CPS skills are assessed in the PISA 2015 survey (Herborn et al., 2020 ). Three socially oriented skills are found to be important in each of the CPS phases: (1) establishing and maintaining a shared understanding, (2) taking appropriate action to solve the problem, and (3) establishing and maintaining team organization. These skills can help the student better solve a problem collaboratively. Andrews-Todd and Forsyth ( 2020 ) proposed a method to evaluate CPS skills based on interactive content and analyzed the performance of collaborative groups with different combinations of CPS skills. The authors found that groups with at least one student with high CPS skills showed significantly better learning performance. This is consistent with findings in areas such as cooperative learning showing that beneficial cooperative behaviors exhibited by team members contribute to team success (Barron, 2003 ).

To better organize a CPS activity, it is necessary to understand students' collaboration processes in CPS learning activities. Scholars have tried to define different phases of the whole CPS process and to map the required skills in individual phases. For example, the OECD (2017 ) defined the CPS process as starting with (A) exploring and understanding and then moving on to (B) representing and formulating, (C) planning and executing, and (D) monitoring and reflecting. Hayes ( 2013 ) categorized the problem-solving process into six phases: finding the problem, representing the problem, planning the solution, carrying out the plan, evaluating the solution, and consolidating gains. Although these definitions have minor differences, the whole process consists of two major phases: problem-understanding and solution development (Kwon et al., 2019 ). The quality of a solution is strongly influenced by the problem-understanding phase (Simon & Hayes, 1976 ), which has been referred to as " a cognitive structure corresponding to a problem, constructed by a solver based on his domain-related knowledge and its organization " (Chi, Feltovich, et al., 1981 ). Students then work together to develop corresponding solutions based on the collaborative cognitive structure in the solution development phase. Therefore, group dynamics (i.e., how students interact with each other) is the critical element during the process (Chi, Glaser, et al., 1981 ). Research efforts have been launched to understand the influencing factors, the quality of learning outcomes, and the collaboration patterns during these two phases. For example, Chi, Glaser, et al. ( 1981 ) found that there are considerable differences between novices and experts in problem-solving. Novices stick to the problem definition or problem-understanding as they work on a solution, whereas experts move forward toward solution development. Kwon et al. ( 2019 ) found that solution-oriented students gained more domain knowledge than problem-oriented students. The authors believe that focusing more on the problem-solving process than on the problem-understanding process is more conducive to improving academic performance. Zheng et al. ( 2020 ) coded the online collaboration behaviors of students in their study, used the Apriori algorithm to find high-frequency transitions between CPS behaviors, and analyzed the collaboration patterns of students with different levels of academic performance. Their results show that, at the problem-solving stage, the group with high scores repeatedly modified and improved the solution, while the group with low scores seldom modified a possible solution after it was proposed.

The literature shows that most CPS studies have obtained data through questionnaires or observations. CPS is rooted in the social constructivist view of learning, which asserts that in-depth learning occurs when students engage in building a shared understanding of a problem through social interactions (Jermann & Dillenbourg, 2008 ; Pear & Crone-Todd, 2002 ). More analytic results, culled not merely from perceptual data, are needed to understand the details of the individual CPS phases, especially from the aspect of group members' dynamics and the mutual effects of the two phases on the quality of CPS outcomes.

Physiological synchrony (PS) and brain-to-brain synchrony (BS)

The development of emerging technologies opens new possibilities in collecting and analyzing students' behaviors and interactions without interfering in the learning process (Chanel & Muhl, 2015 ). Physiological data, such as EDA, heart rate, gesture, body pose, and EEG, reflect the personal physical and/or psychological states of a person (Cukurova et al., 2020 ; Sharma & Giannakos, 2020 ). Such data have been adapted to make up for some of the gaps in perceptional data analysis (Ashwin & Guddeti, 2020 ; Dikker et al., 2017 ; Noroozi et al., 2020 ). PS is one of the analytic approaches used to obtain insights from physiological data. Years of studies in psychophysiology have indicated that human cognition cannot be separated from the body (Critchley et al., 2013 ). This connection is bidirectional: many mental states are reflected in the body's physiological signals (Pecchinenda, 1996 ), and the physiology of the body in turn influences human consciousness and cognition (Critchley & Garfinkel, 2018 ). PS refers to the interdependence of, or the associated activity between, the physiological signals of collaborating individuals. It is an unintentional and spontaneous phenomenon that can be indexed through measures of the human autonomic nervous system (Palumbo et al., 2017 ). PS appears when individuals attend to the same objects or interact effectively, and it manifests as physiological indicators rising or falling simultaneously. Studies have shown that PS can be used to measure whether an interaction is effective or whether students are focused on the same item (Stuldreher et al., 2020a , 2020b ). It was found that students who shared their reflected views also showed higher physiological synchrony (Haataja et al., 2018 ). Because CPS is rooted in the social constructivist view of learning, which asserts that in-depth learning occurs when students engage in building a shared understanding of a problem through social interactions (Jermann & Dillenbourg, 2008 ; Pear & Crone-Todd, 2002 ), PS in the CPS process is mainly influenced by the effectiveness of interaction, and the relationship between PS and learning between students and teachers is worth studying (Davidesco, 2020 ; Nam et al., 2020 ). Dindar, Järvelä, et al. ( 2020 ) recorded students' EDA in CPS and analyzed the relationship between PS and metacognitive experiences. The PS was calculated through a Multidimensional Recurrence Quantification Analysis (MdRQA). The results show a positive relationship between continuous PS episodes and groups' collective mental effort. Dindar, Malmberg, et al. ( 2020 ) investigated the interplay of temporal changes in self-regulated learning processes (i.e., behavioral, cognitive, motivational, and emotional) and their relationship with academic achievement in computer-supported collaborative learning. The PS of the dyads in the collaborating groups was determined by calculating a single session index. The results show that PS among the collaborating students was related to cognitive regulation. Sobocinski et al. ( 2021 ) collected heart rate data and videos of students during collaboration. The authors combined video observation and PS as a possible indicator to identify monitoring and adaptation events. These studies have shown that PS is an effective indicator of the process of collaborative learning.

Scalp-recorded electric potentials or electroencephalograms (EEGs) are the most popular instruments to collect a participant’s brain wave signals. The signals provide estimates of synaptic action at large scales that are closely related to behavior and cognition. Thus, EEG has been recognized as a genuine “window on the mind” (Nunez & Srinivasan, 2006 ). The original EEG records electric potentials and can be further divided into specific ranges through the frequency, namely the delta (1–4 Hz), theta (4–7 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (> 30 Hz) bands (Alarcao & Fonseca, 2019 ). Different wavebands of the EEG reflect different types of activity in the brain (Alarcao & Fonseca, 2019 ). The literature shows that the delta band is related to signal detection or the unconscious mind (Alarcao & Fonseca, 2019 ). The theta band is positively correlated with working memory load or cognitive load (Muthukrishnan et al., 2020 ). The alpha band is related to cognitive load and mediation (Chen & Wang, 2018 ; Yang et al., 2019 ). The beta band is related to attention and decision-making (Chen & Wang, 2018 ; Yang et al., 2019 ). The gamma band has been demonstrated in a wide range of brain processes, including multisensory and sensorimotor integration, attention, memory formation, and perceptual binding (Chand et al., 2016 ; Min et al., 2016 ). The relationships between the EEG bands and brain activities are shown in Table 1 .
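To make the band decomposition concrete, the sketch below splits a simulated single-channel signal into the bands listed above using a simple periodogram. The sampling rate, the stand-in signal, and the 45 Hz cap on the gamma band are assumptions for illustration, not details of the devices discussed in this study.

```r
# Simplified band-power decomposition of one EEG channel via the periodogram.
# Band edges follow the ranges cited in the text; the signal is simulated noise.
fs <- 256                              # assumed sampling rate in Hz
x  <- rnorm(fs * 10)                   # 10 s of stand-in EEG data

spec  <- Mod(fft(x))^2 / length(x)     # raw periodogram
freqs <- (seq_along(x) - 1) * fs / length(x)

bands <- list(delta = c(1, 4), theta = c(4, 7), alpha = c(8, 13),
              beta = c(13, 30), gamma = c(30, 45))

band_power <- sapply(bands, function(b) sum(spec[freqs >= b[1] & freqs < b[2]]))
band_power / sum(band_power)           # relative power per band
```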

BS is a type of PS. It refers to the synchronization of brain activity between two or more people (Nam et al., 2020 ). Compared with PS reflected by EDA and heart rate data, BS can reflect students' cognitive states more accurately (Stuldreher et al., 2020a , 2020b ). Dikker et al. ( 2017 ) studied the relationship between BS and the self-reported engagement of twelve students in a traditional classroom. BS was computed using the method of total interdependence (Wen et al., 2012 ). The authors found that students with a higher level of BS had higher levels of engagement and social dynamics during the lecture. Bevilacqua et al. ( 2019 ) came to a similar conclusion in their study. The authors calculated the level of BS between students and the teacher and studied the relationship between the level of BS and the self-reported engagement level of twelve students in an offline lecture. The results show that students with a higher level of BS with the teacher had higher levels of perceived engagement and closeness. Davidesco et al. ( 2019 ) studied the relationship between BS and academic performance. The authors calculated the BS between students and between students and teachers in a traditional classroom. The results show that students with high performance had higher BS with teachers and that the BS between students was more pronounced when they learned what they got wrong on the pretest and right on the posttest. Due to the limitations of devices, the sample sizes of the above studies were between twelve and thirty-six. These studies have shown that BS is an indicator that reflects academic performance and the learning process, and that BS can provide more insights and fill some gaps in studies that rely on perceptional data only. However, the studies discussed the relationship between BS and the learning process in a traditional classroom without collaboration. More research effort should be expended to understand the CPS process and to study the unique findings that can be extracted from BS. The literature shows that most CPS studies analyze the process through PS (Dindar, Järvelä, et al., 2020 ; Dindar, Malmberg, et al., 2020 ; Sobocinski et al., 2021 ). As CPS is a process of building a shared understanding (Jermann & Dillenbourg, 2008 ; Pear & Crone-Todd, 2002 ), BS can better serve the research, since it can reflect students' cognitive state more accurately (Stuldreher et al., 2020a , 2020b ). Most of the studies are limited to analyzing the original EEG signals (Bevilacqua et al., 2019 ; Davidesco et al., 2019 ; Dikker et al., 2017 ). Since different wavebands of the EEG reflect different types of activity in the brain (Alarcao & Fonseca, 2019 ), analyzing BS through different wavebands, rather than through the original signal, will help reveal the CPS process in more detail.
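The studies above compute BS with dedicated measures such as total interdependence (Wen et al., 2012). As a much simpler stand-in that only illustrates the general idea of comparing two students' band-limited activity over time, one could correlate windowed alpha-band power series, as in the sketch below; this is a toy proxy under assumed parameters, not any of the published methods.

```r
# Toy proxy for brain-to-brain synchrony: correlate two students' windowed
# alpha-band power. This is NOT the total-interdependence measure of Wen et al.;
# it only illustrates the windowing-and-compare idea.
fs      <- 256                                       # assumed sampling rate in Hz
win_len <- fs * 2                                    # 2-second windows

alpha_power <- function(x, fs) {
  spec  <- Mod(fft(x))^2 / length(x)
  freqs <- (seq_along(x) - 1) * fs / length(x)
  sum(spec[freqs >= 8 & freqs < 13])                 # alpha band, 8-13 Hz
}

windowed_power <- function(signal, fs, win_len) {
  starts <- seq(1, length(signal) - win_len + 1, by = win_len)
  sapply(starts, function(s) alpha_power(signal[s:(s + win_len - 1)], fs))
}

student_a <- rnorm(fs * 60)                          # stand-in 1-minute recordings
student_b <- rnorm(fs * 60)

cor(windowed_power(student_a, fs, win_len),
    windowed_power(student_b, fs, win_len))          # higher values ~ more synchrony
```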

Participants

The participants comprised thirty-six undergraduates (15 males and 21 females) from a higher education institution in China. The participants were recruited from one class and were randomly assigned into groups with three students. The average age of the participants was 21.35. All students self-assessed their CPS skills through the social problem-solving inventory revised survey (SPSI–R) (D’Zurilla et al., 2002 ) during the pretest. Each participant was informed of the purpose and procedure of the experiment, and each signed an informed consent form for the experiment. Everyone became used to wearing an electroencephalograph in class after a semester of adaptation.

The experiment was carried out in a simulated online CPS environment in a Computer Networking course that adopted a network simulator, Cisco Packet Tracer, for online collaboration. This visualization and network simulation tool allows students to construct and program network devices collaboratively and observe the outcomes in real time.

The CPS task was a simulation task in which students, acting as the network administrators of the school, discussed how to construct a network to enable network interactions among three colleges while meeting each college's needs for network use. To solve the task well, students needed to consider how to allocate a CIDR address block to the colleges while meeting each college's requirements for the number of IP addresses. Since each college had different IP-number and LAN requirements, each student in the group needed to set up a college network and select the right number of routers, switches, and servers. To explore the process of online CPS, the collaborative task was divided into two stages [i.e., the collaborative problem-understanding stage (PUS) and the collaborative problem-solving stage (PSS)] (Jermann, 2004). The problem given at the problem-understanding stage was: "The university's IT office is going to assign the following subnet, 192.0.64.0/22, to four colleges. As a network administrator of the IT office, you oversee allocating these IP addresses to meet the needs of individual colleges and, at the same time, simplify network management and optimize network performance. The total numbers of IP addresses needed by individual colleges are School of Mathematics (required 126 IPs), School of Physics (required 120 IPs), School of Chemistry (required 500 IPs), and School of Biology (required 240 IPs). In addition, each college needs its own subnet. You need to work with your group members to compile a table listing the needs of individual colleges after group discussion, including the following columns: the binary number required for the host number, the design-assigned network number, the subnet mask, the maximum available address, and the minimum available address." At the problem-solving stage, each group needed to complete the network configurations of individual colleges and the configurations necessary to enable communications among colleges and devices. All networking tasks were completed in Cisco Packet Tracer, which allows multiple users to configure a network independently or collaboratively. Therefore, a group could choose to allocate different tasks to each member or to work on the same task together, depending on its problem-solving strategy. As part of the learning outcomes, each group needed to complete a network topology for individual colleges with configured network devices, including routers, switches, and end-user devices. All activities were completed fully online, and students used VooV Meeting (https://voovmeeting.com/) for group discussion. The tool provides real-time video conferencing with whiteboard, screen-share, and collaborative annotation functions. All meetings were recorded for discussion content analysis.
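As context for the subnetting arithmetic this task involves, the sketch below shows one possible allocation of the 192.0.64.0/22 block that satisfies the quoted host counts, using Python's standard ipaddress module. It is an illustrative solution, not necessarily what any group produced; the allocation order and the variable names are choices made here for demonstration only.

```python
import ipaddress

# One possible allocation of 192.0.64.0/22 consistent with the quoted requirements.
# A /n subnet offers 2**(32 - n) - 2 usable host addresses, so the host counts
# 500, 240, 126, and 120 call for /23, /24, /25, and /25 subnets respectively.
needs = {"Chemistry": 500, "Biology": 240, "Mathematics": 126, "Physics": 120}

remaining = [ipaddress.ip_network("192.0.64.0/22")]
plan = {}
# Allocate the largest demands first so the smaller subnets fit into what is left.
for college, hosts in sorted(needs.items(), key=lambda kv: -kv[1]):
    prefix = 32
    while 2 ** (32 - prefix) - 2 < hosts:
        prefix -= 1
    chunk = next(c for c in remaining if c.prefixlen <= prefix)
    remaining.remove(chunk)
    if chunk.prefixlen == prefix:
        plan[college] = chunk
    else:
        subnets = list(chunk.subnets(new_prefix=prefix))
        plan[college] = subnets[0]
        remaining = subnets[1:] + remaining

for college, net in plan.items():
    hosts_list = list(net.hosts())
    print(f"{college:12s} {net}  mask {net.netmask}  "
          f"usable {hosts_list[0]} - {hosts_list[-1]}")
```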

The EEG signals were collected through portable EEG devices during the collaborative activity. Both traditional electroencephalographs and portable EEG devices can measure EEG. A portable EEG device has fewer channels than a traditional electroencephalograph, but it offers similar results (Li et al., 2020). Moreover, a portable EEG device is easy to wear and can be applied at scale in a real classroom environment, which a traditional electroencephalograph cannot do (Xu & Zhong, 2018). In this experiment, we used a brain wave monitoring device whose core component was the ThinkGear ASIC Module (TGAM). The sampling frequency of the device is 512 Hz, and it is worn on the forehead (the Fp1 area) to measure high-precision electroencephalogram signals. The reliability and accuracy of the equipment have been verified in related studies (Rebolledo-Mendez et al., 2009; Yasui, 2009). The device adopted in this study generates the following EEG signals: delta, theta, low alpha, high alpha, low beta, high beta, low gamma, high gamma, attention, and meditation. The first eight signals were separated from the original EEG signal, and the last two were computed by the device's built-in algorithms.

At the beginning of the experiment, students put on the portable EEG devices and were told about the procedure. After that, students had ten minutes to finish the SPSI–R survey. Next, during the problem-understanding stage, each group had fifteen minutes to understand the problem through observation and analytical reasoning before starting the problem-solving task, which involved experiment design and hypothesis verification. Each group was asked to submit a worksheet listing the key information of its solution: the IP address ranges, the number of usable devices, the subnet masks, the broadcast IDs, and the network IDs. Then, at the problem-solving stage, each group had another thirty minutes to complete the network configurations in Cisco Packet Tracer. Group performance was evaluated by the instructor and two teaching assistants based on the following criteria: (1) whether the group fully listed the overall networking needs and the needs of individual colleges; (2) whether all devices could communicate with each other; (3) whether the network speed was optimized; and (4) whether the network layout was easy to manage and maintain. The procedure of the CPS activity is shown in Table 2.

Data analysis

Brain-to-brain synchrony

To compute BS between students, we employed a synchrony measure known as the phase locking value (PLV) (Perez et al., 2017), which was calculated for every pair of students in a group, across all brain wavebands. The PLV is an indicator of the level of BS, with values ranging from zero to one; a higher PLV means a higher level of BS. The measure reflects the mean phase coherence of the angular distribution of phase differences. The PLV is expressed in Eq. (1),

$$\mathrm{PLV}_{n}=\frac{1}{T}\left|\sum_{t=1}^{T} e^{\,i\left(\varphi_{(t,n)}-\psi_{(t,n)}\right)}\right| \qquad (1)$$

where \(T\) is the number of time points, \(\varphi_{(t,n)}\) is the instantaneous phase at time \(t\) in band \(n\) for one student, and \(\psi_{(t,n)}\) is the corresponding phase for the other student in the pair.
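A compact way to compute Eq. (1) numerically is to extract instantaneous phases with the Hilbert transform and average the complex phase differences. The sketch below is a generic illustration of that calculation, not the authors' exact implementation; in practice the inputs should already be band-limited (for example, with the filters sketched earlier) before the Hilbert transform is applied.

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase locking value between two equal-length, band-limited signals.
    Implements Eq. (1): the magnitude of the mean complex phase difference."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Example: two noisy signals sharing a 10 Hz component should give a high PLV,
# while a signal paired with independent noise should give a value near zero.
fs = 512
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)
a = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * 10 * t + 0.4) + 0.3 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)
print(round(plv(a, b), 3), round(plv(a, noise), 3))
```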

Statistical analysis

The EEG data were collected during the problem-understanding and problem-solving stages. Forty-five minutes of EEG data were collected from each student, generating one record containing all band intensities every second; a total of 97,200 EEG records were collected in this experiment. This research focuses on analyzing the characteristics of BS in CPS. To eliminate the interference of individual signal strength, the intensity of each band was transformed via range transformation (Wang et al., 2021). The PLV was used to measure the level of BS between group members; therefore, the PLV was calculated between every pair of students in the same group.
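The sketch below strings these two steps together for one waveband: a min-max range transformation per student followed by PLV for every pair of group members. It is a plausible reading of the procedure rather than the authors' code; the long-format table and its column names (group, student, stage, second, plus one column per band) are hypothetical.

```python
import itertools
import numpy as np
import pandas as pd
from scipy.signal import hilbert

def range_transform(x):
    """Min-max (range) transformation to remove individual signal-strength differences."""
    x = np.asarray(x, dtype=float)
    span = x.max() - x.min()
    return (x - x.min()) / span if span else np.zeros_like(x)

def plv(x, y):
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

def pairwise_band_plv(records: pd.DataFrame, band: str) -> pd.DataFrame:
    """`records` is a hypothetical long-format table with one row per student per
    second: columns = ["group", "student", "stage", "second", <one column per band>]."""
    rows = []
    for (group, stage), g in records.groupby(["group", "stage"]):
        series = {s: range_transform(d.sort_values("second")[band].to_numpy())
                  for s, d in g.groupby("student")}
        for s1, s2 in itertools.combinations(sorted(series), 2):
            n = min(len(series[s1]), len(series[s2]))  # align lengths defensively
            rows.append({"group": group, "stage": stage, "pair": f"{s1}-{s2}",
                         "band": band, "plv": plv(series[s1][:n], series[s2][:n])})
    return pd.DataFrame(rows)
```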

The characteristics of BS in online CPS

First, to analyze the difference in BS characteristics between the problem-understanding and problem-solving stages, t-tests were conducted to compare students' BS levels between the two CPS stages (see Table 3). Table 3 shows significant differences in BS levels between the CPS stages for brain activities including attention (β), meditation (low α), cognitive load (θ and low α), decision making (β), memory (γ), and perception (γ). In these bands, the problem-understanding stage had a significantly higher BS level than the problem-solving stage.
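As a concrete illustration of this comparison, the sketch below runs a paired t-test on hypothetical per-pair PLV values for one band at the two stages. The numbers are invented, and the paired variant is an assumption (each student pair contributes one value per stage); the article reports only that t-tests were used.

```python
import numpy as np
from scipy import stats

# Hypothetical per-pair PLV values for one band, same student pairs in the same order.
plv_understanding = np.array([0.62, 0.58, 0.71, 0.66, 0.59, 0.64])
plv_solving       = np.array([0.48, 0.51, 0.60, 0.55, 0.47, 0.52])

# Paired comparison, since each student pair contributes one value per stage
# (the article reports t-tests without specifying the exact variant).
t_stat, p_value = stats.ttest_rel(plv_understanding, plv_solving)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```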

Second, the relationship between the level of BS and task performance was analyzed by examining the linear relationship between task performance and BS in each band. Task performance was the score of the collaborative task report graded by the teacher. The dependent variable was task performance; the independent variables were the levels of BS in each stage and each band. The factors retained by the linear regression model through backward elimination explained 51.3% of the variance in groups' task performance (see Table 4). The significant factors were the levels of BS in brain activities including the unconscious mind (δ), attention (attention), memory (high γ), and perception (high γ) at the problem-solving stage.

All of the significant factors were from the problem-solving stage. That means BS levels at the problem-solving stage directly influenced the CPS task performance. Moreover, the synchrony in attention was negatively correlated with the CPS task performance.
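The sketch below illustrates the kind of backward-elimination linear regression described above, using statsmodels and fabricated group-level data. The predictor names, the 0.05 retention threshold, and the data are all assumptions for demonstration; only the general procedure (fit, drop the least significant predictor, refit) mirrors the reported analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_eliminate(X: pd.DataFrame, y, alpha=0.05):
    """Fit OLS and repeatedly drop the least significant predictor until every
    remaining predictor has p < alpha (a simple backward-elimination loop)."""
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha:
            return model
        cols.remove(worst)
    return None

# Fabricated group-level data: 13 groups, band-wise BS predictors at each stage.
rng = np.random.default_rng(2)
X = pd.DataFrame(rng.uniform(0, 1, size=(13, 4)),
                 columns=["delta_pss", "attention_pss", "high_gamma_pss", "theta_pus"])
y = 60 + 20 * X["high_gamma_pss"] - 15 * X["attention_pss"] + rng.normal(0, 3, 13)

model = backward_eliminate(X, y)
if model is None:
    print("no significant predictors")
else:
    print(model.params.round(2))
    print("adjusted R^2:", round(model.rsquared_adj, 3))
```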

The differences in BS between collaborative groups with different CPS skills

To reveal how CPS skills influenced group performance in terms of BS levels in different brainwave signals, ANOVA was conducted to compare the BS levels of different group constitutions at the two CPS stages. To characterize the constitution of each group, the overall average of the CPS skill scores was used as the baseline: students whose CPS skills were higher than the baseline were labeled high CPS students, and students whose CPS skills were lower than the baseline were labeled low CPS students. The high group (HG, hereafter) refers to a group consisting of all high CPS students, and the low group (LG, hereafter) refers to a group consisting of all low CPS students. The high-low group (HLG, hereafter) refers to a group with a mix of high and low CPS students. After random grouping, there were two HGs, one LG, and ten HLGs in this study.
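The classification itself is simple to express in code. The sketch below labels students against the overall SPSI–R mean and tags each group as HG, LG, or HLG; the scores and identifiers are fabricated for illustration.

```python
import pandas as pd

# Hypothetical SPSI-R scores; "group" identifies the three-student teams.
scores = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s4", "s5", "s6"],
    "group":   ["g1", "g1", "g1", "g2", "g2", "g2"],
    "spsi_r":  [14.2, 15.1, 13.8, 10.9, 11.4, 15.6],
})

baseline = scores["spsi_r"].mean()  # the overall average acts as the cut-off
scores["level"] = scores["spsi_r"].gt(baseline).map({True: "high", False: "low"})

def constitution(levels):
    """HG = all high, LG = all low, HLG = a mix of high and low CPS students."""
    unique = set(levels)
    if unique == {"high"}:
        return "HG"
    if unique == {"low"}:
        return "LG"
    return "HLG"

print(scores.groupby("group")["level"].apply(constitution))
```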

After determining the constitution of each group, ANOVA was used to analyze the characteristics of BS across wavebands in the different types of collaborative groups. The results are shown in Table 5. At the problem-understanding stage, the three types of collaborative groups showed significant differences in the level of BS for brain activities including the unconscious mind (δ), attention (high β), decision making (high β), memory (high γ), and perception (high γ). At the problem-solving stage, the three types of collaborative groups showed significant differences in the level of BS for brain activities including the unconscious mind (δ), memory (high γ), and perception (high γ). The Scheffe post-hoc method was used for further comparisons. The following bands show significant results (a computational sketch of this ANOVA and post-hoc procedure follows the list below).

The problem-understanding stage

Delta: LG > HLG > HG, only LG > HG and HLG > HG were significant

High beta: HG > HLG > LG, only HG > LG and HLG > LG were significant

High gamma: HG > HLG > LG, only HG > HLG and HG > LG were significant

The problem-solving stage

Delta: LG > HLG > HG, only HLG > HG was significant

High gamma: HG > HLG > LG, only HG > LG was significant
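The following sketch shows the general shape of such an analysis on fabricated PLV values for one band: a one-way ANOVA across the three group constitutions, followed by pairwise Scheffe comparisons computed from the standard formula. The numbers and group sizes are invented and do not reproduce Table 5.

```python
import itertools
import numpy as np
from scipy import stats

# Fabricated per-pair PLV values in one band for the three group constitutions.
groups = {
    "HG":  np.array([0.74, 0.71, 0.77, 0.72, 0.75, 0.73]),
    "HLG": np.array([0.63, 0.70, 0.58, 0.66, 0.61, 0.69]),
    "LG":  np.array([0.52, 0.49, 0.55]),
}

# One-way ANOVA across the three constitutions.
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Scheffe post-hoc comparisons from the textbook formula:
# reject if (mean_i - mean_j)^2 / (MSE * (1/n_i + 1/n_j)) > (k - 1) * F_crit.
k = len(groups)
n_total = sum(len(v) for v in groups.values())
mse = sum(((v - v.mean()) ** 2).sum() for v in groups.values()) / (n_total - k)
f_crit = stats.f.ppf(0.95, k - 1, n_total - k)
for a, b in itertools.combinations(groups, 2):
    diff = groups[a].mean() - groups[b].mean()
    statistic = diff ** 2 / (mse * (1 / len(groups[a]) + 1 / len(groups[b])))
    print(f"{a} vs {b}: significant = {statistic > (k - 1) * f_crit}")
```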

Because Table 5 revealed significant BS level differences in the unconscious mind (δ), attention (high β), decision making (high β), memory (high γ), and perception (high γ), the average PLV values of these signals were computed to further explore the characteristics of BS in the HG, LG, and HLG groups. The delta band was excluded from the comparisons because its effect on learning is still unknown. Figures 1, 2, and 3 compare the average PLV values of individual groups in the beta band (problem-understanding), the gamma band (problem-understanding), and the gamma band (problem-solving), respectively. HG, HLG, and LG are coded in green, yellow, and red. The dashed lines represent the lowest average PLV value among the HG groups. Generally speaking, HG groups show higher BS levels and LG groups show lower BS levels in the unconscious mind (δ), attention (high β), decision making (high β), memory (high γ), and perception (high γ). However, the HLG groups can be divided into two conditions: HLG groups with higher BS levels and HLG groups with lower BS levels. HLG groups with higher BS levels are those whose BS levels are above the dashed line, such as group 3 in Fig. 2. After examining other data, HLG groups with higher BS levels share a common characteristic: each contains at least one student with high CPS skills. In Fig. 1, the high CPS skill student showed a higher level of beta-band intensity, which means the student was very focused and tended toward positive thinking; this student led the group discussions, which resulted in a higher BS level. In Figs. 2 and 3, the high CPS skill student showed a higher level of gamma-band intensity, which means the student was trying to understand the discussion content and recall the knowledge related to the problem.
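To make the "dashed line" comparison concrete, the sketch below computes group-level mean PLV for one band and stage and splits HLG groups by whether they reach the lowest HG group mean. The table layout, column names, and values are hypothetical; only the thresholding idea follows the description above.

```python
import pandas as pd

def split_hlg_by_bs(pair_plv: pd.DataFrame, band: str, stage: str) -> pd.Series:
    """Label each HLG group as high- or low-BS depending on whether its mean PLV
    reaches the lowest HG group mean (the dashed line in Figs. 1-3)."""
    sub = pair_plv[(pair_plv["band"] == band) & (pair_plv["stage"] == stage)]
    group_means = sub.groupby(["constitution", "group"])["plv"].mean()
    threshold = group_means.loc["HG"].min()
    return (group_means.loc["HLG"] >= threshold).map(
        {True: "HLG-high-BS", False: "HLG-low-BS"})

# Tiny fabricated example with hypothetical column names and values.
pair_plv = pd.DataFrame({
    "group":        ["g1", "g1", "g2", "g2", "g3", "g3", "g4", "g4"],
    "constitution": ["HG", "HG", "HLG", "HLG", "HLG", "HLG", "LG", "LG"],
    "band":         ["high_gamma"] * 8,
    "stage":        ["PUS"] * 8,
    "plv":          [0.74, 0.76, 0.77, 0.78, 0.55, 0.58, 0.48, 0.50],
})
print(split_hlg_by_bs(pair_plv, "high_gamma", "PUS"))
```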

Figure 1. The mean of the PLV values of the beta bands at the problem-understanding stage

Figure 2. The mean of the PLV values of the high gamma bands at the problem-understanding stage

Figure 3. The mean of the PLV values of the high gamma bands at the problem-solving stage

Qualitative analysis of group discussions

To further validate and interpret the analytic results from the EEG signals, all group conversations were recorded and transcribed. The qualitative part mainly focuses on comparing discussion frequencies and contents in the HG, LG, and HLG groups. In the previous section, the HG, LG, and HLG groups showed significant differences in the following signals: high gamma (at both stages) and low beta (at the problem-understanding stage). These signals were used to further divide the HLG groups into HLG groups with high BS levels (if the group showed average BS levels similar to the HG groups) and HLG groups with low BS levels (if the group showed lower average BS levels than the HG groups). The results for interaction frequency (see Table 6) show that HG groups had the highest average frequency of interactions, while LG groups had the lowest. HLG groups with high BS levels show higher interaction frequencies than HLG groups with low BS levels. Overall, groups with higher BS levels had intensive oral discussions at both stages. Moreover, HG groups had higher interaction frequencies at the problem-understanding stage than at the problem-solving stage, while the other groups had higher interaction frequencies at the problem-solving stage. In terms of discussion content, HG groups focused on discussing knowledge related to problem-solving, for strategy confirmation or knowledge exchange. For example, HG students had conversations like "The maximum address here is 67. So, so the 8-bit is 255.0, right?" and "Why do we need to change the subnet mask? I don't think we need to change the subnet mask." At the problem-solving stage, in addition to discussing the knowledge related to the problem and the solution, they also frequently shared screens to better explain their ideas or network configurations. On the other hand, the LG groups had fewer oral discussions at both stages. At the problem-understanding stage, LG students mainly discussed task allocation, not the strategies or knowledge related to problem-solving. At the problem-solving stage, LG students mainly focused on completing their own tasks; for example, LG students had conversations such as "What is your gateway?" and "I will ping you." The HLG groups with high BS levels had collaboration patterns similar to the HG groups: they also focused on discussing knowledge related to problem-solving, for strategy confirmation or knowledge exchange, and the students with high CPS skills led most of the discussions. At the problem-understanding stage, HLG groups with high BS levels had conversations like "I don't think the third is 255 since all the ones in front and all the zeros in the back." At the problem-solving stage, they discussed the knowledge related to the solution and shared their screens frequently. The HLG groups with low BS levels acted like the LG groups. Their students had few verbal interactions; the students with high CPS skills tried to lead the discussion, but the other students responded with very short answers and simply followed orders without rich discussion. For example, the students with high CPS skills gave orders such as "You put in two PCs and one switch. Yeah. Two. Yeah." and "You can click setting, setting, and set the gateway to 192.0.65.1." Another situation in HLG groups with low BS levels was that one student tried to solve the problem alone, with conversations like "I'll figure everything out and then I'll tell you how to fill in the blanks." Based on these results, the HG groups and the HLG groups with high BS levels had better discussions in terms of interaction frequency and content quality.

The problem-understanding stage had a significantly higher level of brain-to-brain synchrony than the problem-solving stage

Table 3 compared the BS levels of different brainwave signals at the two CPS stages. The significant results and their corresponding brain activities are summarized below.

The problem-understanding stage > the problem-solving stage in the following signals:

Theta: positively correlated with cognitive load

Low alpha: negatively correlated with cognitive load, related to meditation

Low and high beta: related to attention and decision making

Low and high gamma: related to multisensory and sensorimotor integration, memory formation, and perceptual binding

The problem-understanding stage had a significantly higher level of BS than the problem-solving stage. The significant difference in the gamma band has been analyzed above: students were trying to understand the same problem, which led to higher synchronization in memory and perception (Chand et al., 2016; Min et al., 2016). The theta band is positively correlated with cognitive load (Muthukrishnan et al., 2020). The higher level of theta-band synchrony at the problem-understanding stage may arise because students read the same question and try to reach a consistent understanding of it, so they have a higher level of synchrony in cognitive load. The beta band is related to the attention and decision-making activities of the human brain (Chen & Wang, 2018; Yang et al., 2019). Students at the problem-understanding stage show a higher level of synchrony in this band, which may be because they attend to a common object and think about the same problem. Previous studies showed that the BS level is highly correlated with oral interaction. At the problem-understanding stage, students tried to understand the same questions and requirements as a group, whereas individual students might be assigned different tasks at the problem-solving stage; therefore, the levels of BS decreased. Because of the characteristic differences between these two stages, it might be better to design a CPS activity with two or more stages. In addition, supporting or encouraging group discussion is crucial at the problem-understanding stage. Strategies such as question prompts or role-play might be used to support group discussions.

The brain-to-brain synchrony of the attention, delta, and high gamma bands at the problem-solving stage is significantly correlated with task performance

Table 4 identified which brainwaves' BS levels at the two CPS stages were significant predictors of the quality of the final solutions proposed by individual groups. The significant performance predictors are summarized below:

The problem-understanding stage: no significant predictors.

The problem-solving stage:

Attention (related to focusing attention): the BS is significantly negatively correlated with performance.

Delta (related to the unconscious mind, signal detection): The BS is significantly positively correlated with performance.

High gamma (related to multisensory and sensorimotor integration, memory formation, and perceptual binding): the BS is significantly positively correlated with the performance.

The results indicate that all significant predictors came from the problem-solving stage, which implies the importance of CPS skills during that stage. As each group member might be assigned different tasks, how to coordinate the collaboration process and integrate everyone's work into the final solution is the key to success; therefore, group members' CPS skills might play an important role at this stage. This inference is also supported by comparing BS levels and performance among the different group constitutions. The BS of attention at the problem-solving stage is negatively correlated with task performance, which may be because group members no longer need to focus on the same content together at this stage; instead, individuals focus and work on different points. The BS in the gamma band at the problem-solving stage is positively correlated with task performance. The gamma band is related to memory, perception, and other brain activities; BS in this band indicates that students memorize or understand content at the same time when discussing solutions, and such synchrony in memory and understanding is conducive to collaborative learning. These findings are consistent with conclusions from related work. Kwon et al. (2019) found that investment at the problem-solving stage was more conducive to improving collaboration performance than investment at the problem-understanding stage. Further, through exploration of collaborative learning modes, related studies have found that the main reason for performance differences among collaboration groups lies in whether the plan is effectively discussed and improved at the problem-solving stage (Chang et al., 2017; Zheng et al., 2020). In addition, the BS of the delta band is also positively correlated with task performance. The delta band is related to subconscious brain activity (Alarcao & Fonseca, 2019); the meaning of delta-band BS and its significance in collaborative learning remain to be further explored.

Since the BS at the problem-solving stage matters more to the performance of collaborative tasks than that at the problem-understanding stage, it is necessary to strengthen students' collaboration at this stage, or to add appropriate guidance, when designing online collaborative learning. The detection of BS in online collaborative activities, and interventions based on the collaborative efficiency inferred from BS, should also focus on the problem-solving stage.

Different types of collaboration groups have significant brain-to-brain synchrony differences in the delta and high gamma bands during the problem-understanding stage and the problem-solving stage

Table 5 compares the BS levels of different brainwave signals among the three group constitutions at both CPS stages. The significant signals and their corresponding brain activities can be summarized as follows:

The problem-understanding stage:

Delta (related to the unconscious mind, signal detection): LG > HG and HLG > HG

High beta (related to attention and decision making): HG > LG and HLG > LG

High gamma (related to multisensory and sensorimotor integration, memory formation, and perceptual binding): HG > HLG and HG > LG

The problem-solving stage:

Delta (related to the unconscious mind, signal detection): HLG > HG

High gamma (related to multisensory and sensorimotor integration, memory formation, and perceptual binding): HG > LG

Only the LG and HLG groups had significantly higher BS levels than the HG in the delta band. This might imply that delta-band synchrony is associated with low prior CPS experience or low CPS skills. However, as mentioned earlier, how this signal relates to learning is still unclear (Alarcao & Fonseca, 2019), so additional studies are needed to reveal the relationship. In the high gamma band, the HG had higher BS, indicating that a collaborative group composed of students with high CPS skills had a common focus on memory and the perception of content. In addition, at the problem-understanding stage, the HG had higher BS in the high beta band, which did not appear at the problem-solving stage, indicating that the students in the HG had a higher level of synchrony of attention during problem comprehension; every group member was clear about their role and tasks.

The high-low groups could have similar brain-to-brain synchrony levels to the high group

By plotting the PLV values and the mean band intensity of any two students in each group (see Figs. 1, 2, 3), it was found that the HLG groups were polarized in their PLV values. In the high beta and high gamma bands of both stages, several of the HLG groups had BS levels similar to the HG groups. In these groups, the students with high CPS skills were more active in brain activity, played a leading role in the collaboration process, and guided the rest of the students to cooperate. The results are consistent with the findings of Andrews-Todd and Forsyth (2020) that collaborative groups were more likely to have better academic performance when there was at least one student with strong CPS ability in the group.

The qualitative analyses show that the HG groups and the HLG groups with high BS levels had higher conversation frequencies and discussion quality, while the LG groups and the HLG groups with low BS levels had lower conversation frequencies and discussion quality. This means that groups with a higher level of BS had more effective interactions. The result validates our findings that (a) BS levels can be an effective indicator for evaluating the process of CPS, and (b) groups with different CPS skills have different BS characteristics. In addition, HLG groups might not always have effective discussions. As HLG is likely the most common group constitution in CPS, how to support and facilitate effective discussions in HLG groups is important for CPS activities. Moreover, HG groups had higher interaction frequencies at the problem-understanding stage than at the problem-solving stage, while the other groups had higher interaction frequencies at the problem-solving stage. Based on the conversation observations, HG groups had discussed the problems sufficiently at the problem-understanding stage, so group members mainly focused on solution-related discussions at the problem-solving stage. On the other hand, the remaining groups often overlooked or missed some key discussion points at the problem-understanding stage and had to address these missing parts at the problem-solving stage. Therefore, instructors might consider providing additional guidance to support group discussions at the problem-understanding stage.

A group with only high CPS skill students could achieve better collaborative performance in online collaborative learning. However, this kind of group is not conducive to the learning of all students. Rather, organizing groups with students at different levels of CPS skills is better for all students, and such groups can achieve BS levels similar to those of the HG. To help this kind of mixed group achieve better collaborative performance, effective guidance is necessary. How to evaluate interaction quality through students' BS levels, and how to provide interventions so that all groups can collaborate and learn better, are questions to be studied.

Limitations

This study also has the following limitations. To capture the actual brainwave activities during the CPS activities, this study did not group students based on their CPS skill levels; the random grouping generated unbalanced numbers of HG, HLG, and LG groups. The study was also limited by the maximum number of concurrent Bluetooth device connections and recruited only thirty-six participants in a single computing course (i.e., a Computer Networking course). Although the sample size is similar to that of related studies, a larger sample may help to discover more interesting and generalizable findings. In addition, the findings of this study might not generalize to other subject areas.

This study analyzed students' learning processes in a CPS activity from the perspective of cognitive neuroscience. First, BS proved to be an effective indicator for observing group interactions during collaborative problem solving and provides insights for teachers and researchers to further understand the CPS process. Second, the analytic results show common and unique characteristics at the problem-understanding and problem-solving stages. The problem-understanding stage had a significantly higher BS level than the problem-solving stage in most of the EEG bands. The results show that the expectations and requirements of these two stages are different and might require different CPS skills to achieve better learning outcomes; how to support or evaluate individual students at these two stages more effectively would therefore be a follow-up study, and BS can still serve as an indicator to observe how personalized support impacts students' EEG signals. Third, although the problem-solving stage had a lower BS level in most of the EEG bands, the results indicate that the BS levels at the problem-solving stage directly influenced the CPS task performance; this finding also validated conclusions from other studies. Finally, groups with higher BS levels showed more effective interactions in terms of discussion frequency and discussion quality. Instructors should avoid assigning a group of all low CPS skill students or should provide basic CPS skill training before the CPS activity. However, this study also found that HLG groups (i.e., a mix of high and low CPS skill students) might exhibit totally different interactions. As this combination is likely to be the most common type in practice, how to provide timely and personalized support to foster effective interactions becomes an important research topic. Future studies can concentrate on the development of early-warning mechanisms or effective discussion interventions by tracking group BS levels, especially for the HLG group constitution.

As an instructional approach, a CPS activity aims to engage students in the instructional activity and cultivate their CPS skills. This study provides evidence from cognitive neuroscience to support its effectiveness as an instructional approach in online learning. With the development of emerging technologies, more and more wearable devices can be used to track students' physiological changes during the learning process. This study serves as a starting point in this endeavor, and more research efforts in this area can be expected in the near future.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CPS: Collaborative problem solving

BS: Brain-to-brain synchrony

EDA: Electrodermal activity

PS: Physiological synchrony

EEG: Electroencephalogram

HLG: High-low group

Alarcao, S. M., & Fonseca, M. J. (2019). Emotions recognition using EEG signals: A survey. IEEE Transactions on Affective Computing, 10 (3), 374–393. https://doi.org/10.1109/taffc.2017.2714671

Andrews, J. J., & Rapp, D. N. (2015). Benefits, costs, and challenges of collaboration for learning and memory. Translational Issues in Psychological Science, 1 (2), 182.

Andrews-Todd, J., & Forsyth, C. M. (2020). Exploring social and cognitive dimensions of collaborative problem solving in an open online simulation-based task. Computers in Human Behavior, 104 , 105759.

Ashwin, T. S., & Guddeti, R. M. R. (2020). Automatic detection of students’ affective states in classroom environment using hybrid convolutional neural networks. Education and Information Technologies, 25 (2), 1387–1415. https://doi.org/10.1007/s10639-019-10004-6

Barron, B. (2003). When smart groups fail. The Journal of the Learning Sciences, 12 (3), 307–359.

Bevilacqua, D., Davidesco, I., Wan, L., Chaloner, K., Rowland, J., Ding, M., Poeppel, D., & Dikker, S. (2019). Brain-to-brain synchrony and learning outcomes vary by student-teacher dynamics: Evidence from a real-world classroom electroencephalography study. Journal of Cognitive Neuroscience, 31 (3), 401–411. https://doi.org/10.1162/jocn_a_01274

Care, E., Griffin, P., & McGaw, B. (2012). Assessment and teaching of 21st century skills . Springer.

Chand, G. B., Lamichhane, B., & Dhamala, M. (2016). Face or house image perception: Beta and gamma bands of oscillations in brain networks carry out decision-making. Brain Connectivity, 6 (8), 621–631. https://doi.org/10.1089/brain.2016.0421

Chanel, G., & Muhl, C. (2015). Connecting brains and bodies: Applying physiological computing to support social interaction. Interacting with Computers, 27 (5), 534–550. https://doi.org/10.1093/iwc/iwv013

Chang, C. J., Chang, M. H., Chiu, B. C., Liu, C. C., Chiang, S. H., Wen, C. T., Hwang, F. K., Wu, Y. T., Chao, P. Y., Lai, C. H., & Wu, S. W. (2017). An analysis of student collaborative problem solving activities mediated by collaborative simulations. Computers Education, 114 , 222–235.

Chen, C. M., & Wang, J. Y. (2018). Effects of online synchronous instruction with an attention monitoring and alarm mechanism on sustained attention and learning performance. Interactive Learning Environments, 26 (4), 427–443. https://doi.org/10.1080/10494820.2017.1341938

Chi, M. T., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5 (2), 121–152.

Chi, M. T., Glaser, R., & Rees, E. (1981b). Expertise in problem solving.

Critchley, H. D., Eccles, J., & Garfinkel, S. N. (2013). Interaction between cognition, emotion, and the autonomic nervous system. In Handbook of clinical neurology (Vol. 117, pp. 59–77): Elsevier.

Critchley, H. D., & Garfinkel, S. N. (2018). The influence of physiological signals on cognition. Current Opinion in Behavioral Sciences, 19 , 13–18.

Cukurova, M., Giannakos, M., & Martinez-Maldonado, R. (2020). The promise and challenges of multimodal learning analytics. British Journal of Educational Technology, 51 (5), 1441–1449. https://doi.org/10.1111/bjet.13015

D’Zurilla, T. J., Nezu, A. M., & Maydeu-Olivares, A. (2002). Social problem-solving inventory-revised. APA PsycTests .

Davidesco, I. (2020). Brain-to-brain synchrony in the STEM classroom. CBE Life Sciences Education, 19 (3), es8.

Davidesco, I., Laurent, E., Valk, H., West, T., Dikker, S., Milne, C., & Poeppel, D. (2019). Brain-to-brain synchrony between students and teachers predicts learning outcomes. BioRxiv , 644047.

Dikker, S., Wan, L., Davidesco, I., Kaggen, L., Oostrik, M., McClintock, J., Rowland, J., Michalareas, G., Van Bavel, J. J., Ding, M., & Poeppel, D. (2017). Brain-to-brain synchrony tracks real-world dynamic group interactions in the classroom. Current Biology, 27 (9), 1375–1380. https://doi.org/10.1016/j.cub.2017.04.002

Dindar, M., Järvelä, S., & Haataja, E. (2020). What does physiological synchrony reveal about metacognitive experiences and group performance? British Journal of Educational Technology, 51 (5), 1577–1596.

Dindar, M., Malmberg, J., Järvelä, S., Haataja, E., & Kirschner, P. A. (2020). Matching self-reports with electrodermal activity data: Investigating temporal changes in self-regulated learning. Education Information Technologies, 25 (3), 1785–1802.

Haataja, E., Malmberg, J., & Järvelä, S. (2018). Monitoring in collaborative learning: Co-occurrence of observed behavior and physiological synchrony explored. Computers in Human Behavior, 87 , 337–347.

Hayes, J. R. (2013). The complete problem solver . Routledge.

Herborn, K., Stadler, M., Mustafić, M., & Greiff, S. (2020). The assessment of collaborative problem solving in PISA 2015: Can computer agents replace humans? Computers in Human Behavior, 104 , 105624.

Jermann, P. R. (2004). Computer support for interaction regulation in collaborative problem-solving. [Publisher not identified].

Jermann, P., & Dillenbourg, P. (2008). Group mirrors to support interaction regulation in collaborative problem solving. Computers Education, 51 (1), 279–296.

Kwon, K., Song, D., Sari, A. R., & Khikmatillaeva, U. (2019). Different types of collaborative problem-solving processes in an online environment: Solution oriented versus problem oriented. Journal of Educational Computing Research, 56 (8), 1277–1295. https://doi.org/10.1177/0735633117740395

Li, Q., Ren, Y., Wei, T., Wang, C., Liu, Z., & Yue, J. (2020). A Learning Attention Monitoring System via Photoplethysmogram Using Wearable Wrist Devices. In Artificial Intelligence Supported Educational Technologies (pp. 133–150): Springer.

Min, L., Guang, M., Wenming, Z., Wade, J., & Sarkar, N. (2016). Brain gamma oscillations of healthy people during simulated driving. Intelligent Robotics and Applications. 9th International Conference, ICIRA 2016. Proceedings: LNAI 9835 , 453–458. https://doi.org/10.1007/978-3-319-43518-3_43

Muthukrishnan, S. P., Soni, S., & Sharma, R. (2020). Brain networks communicate through theta oscillations to encode high load in a visuospatial working memory task: An EEG connectivity study. Brain Topography, 33 (1), 75–85. https://doi.org/10.1007/s10548-019-00739-3

Nam, C. S., Choo, S., Huang, J., & Park, J. (2020). Brain-to-brain neural synchrony during social interactions: A systematic review on hyperscanning studies. Applied Sciences, 10 (19), 6669.

Noroozi, O., Pijeira-Díaz, H. J., Sobocinski, M., Dindar, M., Järvelä, S., & Kirschner, P. A. (2020). Multimodal data indicators for capturing cognitive, motivational, and emotional learning processes: A systematic literature review. Education Information Technologies, 25 , 5499–5547.

Nunez, P. L., & Srinivasan, R. (2006). A theoretical basis for standing and traveling brain waves measured with human EEG with implications for an integrated consciousness. Clinical Neurophysiology, 117 (11), 2424–2435.

OECD. (2017). PISA 2015 Results (Volume V). Collaborative Problem Solving.

Palumbo, R. V., Marraccini, M. E., Weyandt, L. L., Wilder-Smith, O., McGee, H. A., Liu, S., & Goodwin, M. S. (2017). Interpersonal autonomic physiology: A systematic review of the literature. Personality Social Psychology Review, 21 (2), 99–141.

Pear, J. J., & Crone-Todd, D. E. (2002). A social constructivist approach to computer-mediated instruction. Computers Education, 38 (1–3), 221–231.

Pecchinenda, A. (1996). The affective significance of skin conductance activity during a difficult problem-solving task. Cognition & Emotion, 10 (5), 481–504.

Perez, A., Carreiras, M., & Dunabeitia, J. A. (2017). Brain-to-brain entrainment: EEG interbrain synchronization while speaking and listening. Scientific Reports, 7 , 12. https://doi.org/10.1038/s41598-017-04464-4

Rebolledo-Mendez, G., Dunwell, I., Martínez-Mirón, E. A., Vargas-Cerdán, M. D., De Freitas, S., Liarokapis, F., & García-Gaona, A. R. (2009). Assessing neurosky’s usability to detect attention levels in an assessment exercise. Paper presented at the International Conference on Human-Computer Interaction.

Schulze, J., & Krumm, S. (2017). The “virtual team player”: A review and initial model of knowledge, skills, abilities, and other characteristics for virtual collaboration. Organizational Psychology Review, 7 (1), 66–95. https://doi.org/10.1177/2041386616675522

Sharma, K., & Giannakos, M. (2020). Multimodal data capabilities for learning: What can multimodal data tell us about learning? British Journal of Educational Technology, 51 (5), 1450–1484. https://doi.org/10.1111/bjet.12993

Simon, H. A., & Hayes, J. R. (1976). The understanding process: Problem isomorphs. Cognitive Psychology, 8 (2), 165–190.

Sobocinski, M., Malmberg, J., & Järvelä, S. (2021). Exploring adaptation in socially-shared regulation of learning using video and heart rate data. Technology, Knowledge Learning , 1–20.

Stadler, M., Herborn, K., Mustafić, M., & Greiff, S. (2020). The assessment of collaborative problem solving in PISA 2015: An investigation of the validity of the PISA 2015 CPS tasks. Computers Education, 157 , 103964.

Stuldreher, I. V., Thammasan, N., Van Erp, J. B., & Brouwer, A.-M. (2020a). Physiological synchrony in EEG, electrodermal activity and heart rate detects attentionally relevant events in time. Frontiers in Neuroscience, 14 , 1257.

Stuldreher, I. V., Thammasan, N., van Erp, J. B., & Brouwer, A.-M. (2020b). Physiological synchrony in EEG, electrodermal activity and heart rate reflects shared selective auditory attention. Journal of Neural Engineering, 17 (4), 046028.

Wang, Y. W., Yang, X., Yao, X. Q., & Fu, C. (2021). Computational methods of brain-to-brain coupling during human interaction. Chinese Science Bulletin-Chinese, 66 (4–5), 501–514. https://doi.org/10.1360/tb-2020-0642

Wen, X. T., Mo, J., & Ding, M. Z. (2012). Exploring resting-state functional connectivity with total interdependence. NeuroImage, 60 (2), 1587–1595. https://doi.org/10.1016/j.neuroimage.2012.01.079

Xu, J. H., & Zhong, B. C. (2018). Review on portable EEG technology in educational research. Computers in Human Behavior, 81 , 340–349. https://doi.org/10.1016/j.chb.2017.12.037

Yang, X. M., Zhao, X. S., Tian, X. S., & Xing, B. B. (2019). Effects of environment and posture on the concentration and achievement of students in mobile learning. Interactive Learning Environments . https://doi.org/10.1080/10494820.2019.1707692

Yasui, Y. (2009). A brainwave signal measurement and data processing technique for daily life applications. Journal of Physiological Anthropology, 28 (3), 145–150.

Zheng, Y., Bao, H., Shen, J., & Zhai, X. (2020). Investigating sequence patterns of collaborative problem-solving behavior in online collaborative discussion activity. Sustainability, 12 (20), 8522.

Acknowledgements

Not applicable.

Funding

This paper was supported by the Large-Scale Longitudinal and Cross-Sectional Study of Student Development (2021YFC3340803).

Author information

Xu Du and Lizhao Zhang contributed equally to this work and should be considered co-first authors.

Authors and Affiliations

National Engineering Research Center for E-Learning, Central China Normal University, Wuhan, 430079, China

Xu Du, Hao Li & Yiqian Xie

National Engineering Laboratory for Educational Big Data, Central China Normal University, Wuhan, 430079, China

Lizhao Zhang & Jui-Long Hung

Department of Educational Technology, Boise State University, 1910 University Dr., Boise, ID, 83725, USA

Jui-Long Hung

Department of Educational Studies, University of South Carolina, Charleston, South Carolina, USA

Hengtao Tang

Contributions

XD conceptualized and designed the work; writing-reviewing and editing; participated in acquisition, analysis, and interpretation of data. JH designed the work; writing-reviewing and editing; participated in analysis, and interpretation of data. LZ designed the work; writing-original draft preparation and editing; participated in acquisition, analysis, and interpretation of data. HL writing-reviewing and editing; participated in analysis, and interpretation of data. HT writing-reviewing and editing; participated in analysis, and interpretation of data. YX participated in analysis, and interpretation of data. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jui-Long Hung .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests and no issues related to the journal's policies.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Du, X., Zhang, L., Hung, JL. et al. Understand group interaction and cognitive state in online collaborative problem solving: leveraging brain-to-brain synchrony data. Int J Educ Technol High Educ 19 , 52 (2022). https://doi.org/10.1186/s41239-022-00356-4

Received : 08 March 2022

Accepted : 27 June 2022

Published : 04 October 2022

DOI : https://doi.org/10.1186/s41239-022-00356-4

Identifying collaborative problem-solver profiles based on collaborative processing time, actions and skills on a computer-based task

  • Published: 30 August 2023
  • Volume 18 , pages 465–488, ( 2023 )

  • Huilin Zhang 1 ,
  • Li Ni 1 &
  • Da Zhou   ORCID: orcid.org/0000-0002-9463-6637 2  

Understanding how individuals collaborate with others is a complex undertaking, because collaborative problem-solving (CPS) is an interactive and dynamic process. We attempt to identify distinct collaborative problem-solver profiles of Chinese 15-year-old students on a computer-based CPS task using process data from the 2015 Program for International Student Assessment (PISA, N  = 1,677), and further to examine how these profiles may relate to student demographics (i.e., gender, socioeconomic status) and motivational characteristics (i.e., achieving motivation, attitudes toward collaboration), as well as CPS performance. The process indicators we used include time-on-task, actions-on-task, and three specific CPS process skills (i.e., establish and maintain shared understanding, take appropriate action to solve the problem, establish and maintain team organization). The results of latent profile analysis indicate four collaborative problem-solver profiles: Disengaged , Struggling , Adaptive, and Excellent . Gender, socioeconomic status, attitudes toward collaboration and CPS performance are shown to be significantly associated with profile membership, yet achieving motivation was not a significant predictor. These findings may contribute to better understanding of the way students interact with computer-based CPS tasks and inform educators of individualized and adaptive instructions to support student collaborative problem-solving.

Abdi, B. (2010). Gender differences in social skills, problem behaviours and academic competence of Iranian kindergarten children based on their parent and teacher ratings.  Procedia: Social and Behavioral Sciences, 5, 1175–1179.

Ahonen, A. K., & Harding, S. M. (2018). Assessing online collaborative problem solving among school children in Finland: A case study using ATC21S TM in a national context. International Journal of Learning, Teaching and Educational Research, 17 (2), 138–158. https://doi.org/10.26803/ijlter.17.2.9

Ames, C. (1992). Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology, 84 (3), 261–271. https://doi.org/10.1037/0022-0663.84.3.261

Article   Google Scholar  

Andrews-Todd, J., & Forsyth, C. M. (2020). Exploring social and cognitive dimensions of collaborative problem solving in an open online simulation-based task. Computers in Human Behavior, 104. https://doi.org/10.1016/j.chb.2018.10.025

Avry, S., Chanel, G., Bétrancourt, M., & Molinari, G. (2020). Achievement appraisals, emotions and socio-cognitive processes: How they interplay in collaborative problem-solving? Computers in Human Behavior, 107 . https://doi.org/10.1016/j.chb.2020.106267

Bamaca-Colbert, M. Y., & Gayles, J. G. (2010). Variable-centered and person-centered approaches to studying Mexican-origin mother-daughter cultural orientation dissonance. Journal of Youth Adolescents, 39 (11), 1274–1292. https://doi.org/10.1007/s10964-009-9447-3

Bandura, A. (1977). Social learning theory . Prentice Hall.

Google Scholar  

Caceres, M., Nussbaum, M., Marroquin, M., Gleisner, S., & Marquinez, J. T. (2018). Building arguments: Key to collaborative scaffolding. Interactive Learning Environments, 26 (3), 355–371. https://doi.org/10.1080/10494820.2017.1333010

Care, E., Scoular, C., & Griffin, P. (2016). Assessment of collaborative problem solving in education environments. Applied Measurement in Education, 29 (4), 250–264. https://doi.org/10.1080/08957347.2016.1209204

Chung, G., O’Neil, H., Jr., & Herl, H. (1999). The use of computer-based collaborative knowledge mapping to measure team process and team outcomes. Computers in Human Behavior, 15 (3–4), 463–493.

Conger, R., & Donnellan, M. (2007). An interactionist perspective on the socioeconomic context of human development. Annual Review of Psychology, 58 (1), 175–199.

Cowden, R. G., Mascret, N., & Duckett, T. R. (2021). A person-centered approach to achievement goal orientations in competitive tennis players: Associations with motivation and mental toughness. Journal of Sport and Health Science, 10 (1), 73–81.

Cukurova, M., Luckin, R., Millán, E., & Mavrikis, M. (2018). The NISPI framework: Analyzing collaborative problem-solving from students’ physical interactions. Computers and Education, 116 , 93–109.

De Boeck, P., & Scalise, K. (2019). Collaborative problem solving: Processing actions, time, and performance. Frontiers in Psychology, 10 , 1280. https://doi.org/10.3389/fpsyg.2019.01280

Delton, A., Cosmides, L., Guemo, M., Robertson, T., & Tooby, J. (2012). The psychosemantics of free riding: Dissecting the architecture of a moral concept. Journal of Personality and Social Psychology, 102 (6), 1252.

Dindar, M., Järvelä, S., & Järvenoja, H. (2020). Interplay of metacognitive experiences and performance in collaborative problem solving. Computers and Education, 154 . https://doi.org/10.1016/j.compedu.2020.103922

Doise, W., & Mugny, G. (1984). The social development of the intellect . Pergamon Press.

Dowell, N., Lin, Y., Godfrey, A., & Brooks, C. (2020). Exploring the relationship between emergent socio-cognitive roles, collaborative problem-solving skills and outcomes: A group communication analysis. Journal of Learning Analytics, 7 (1). https://doi.org/10.18608/jla.2020.71.4

Du, X., Zhang, L., Hung, J. L., Li, H., Tang, H., & Xie, Y. (2022). Understand group interaction and cognitive state in online collaborative problem solving: Leveraging brain-to-brain synchrony data. International Journal of Educational Technology in Higher Education, 19 (1). https://doi.org/10.1186/s41239-022-00356-4

Eichmann, B., Goldhammer, F., Greiff, S., Brandhuber, L., & Naumann, J. (2020). Using process data to explain group differences in complex problem solving. Journal of Educational Psychology, 112 (8), 1546–1562. https://doi.org/10.1037/edu0000446

Emerson, T. L. N., English, L. K., & McGoldrick, K. (2015). Evaluating the cooperative component in cooperative learning: A quasi-experimental study. Journal of Economic Education, 46 (1), 1–13. https://doi.org/10.1080/00220485.2014.978923

Emmen, R., Malda, M., Mesman, J., Van Ijzendoorn, M., Prevoo, M., & Yeniad, N. (2013). Socioeconomic status and parenting in ethnic minority families: Testing a minority family stress model. Journal of Family Psychology, 27 (6), 896–904.

Ferguson-Patrick, K. (2020). Cooperative learning in Swedish classrooms: Engagement and relationships as a focus for culturally diverse students. Education Sciences, 10 (11). https://doi.org/10.3390/educsci10110312

Gao, Q., Zhang, S., Cai, Z., Liu, K., Hui, N., & Tong, M. (2022). Understanding student teachers’ collaborative problem-solving competency: Insights from process data and multidimensional item response theory. Thinking Skills and Creativity, 45. https://doi.org/10.1016/j.tsc.2022.101097

Greiff, S., Molnár, G., Martin, R., Zimmermann, J., & Csapó, B. (2018). Students’ exploration strategies in computer-simulated complex problem environments: A latent class approach. Computers and Education, 126 , 248–263. https://doi.org/10.1016/j.compedu.2018.07.013

Gu, X., & Cai, H. (2019). How a semantic diagram tool influences transaction costs during collaborative problem solving. Journal of Computer Assisted Learning, 35 (1), 23–33. https://doi.org/10.1111/jcal.12307

Haataja, E., Malmberg, J., Dindar, M., & Jarvela, S. (2022). The pivotal role of monitoring for collaborative problem solving seen in interaction, performance, and interpersonal physiology. Metacognition and Learning, 17 (1), 241–268. https://doi.org/10.1007/s11409-021-09279-3

Hajovsky, D., Caemmerer, J., & Mason, B. (2022). Gender differences in children’s social skills growth trajectories. Applied Developmental Science, 26 (3), 488–503.

Hänze, M., & Berger, R. (2007). Cooperative learning, motivational effects, and student characteristics: An experimental study comparing cooperative learning and direct instruction in 12th grade physics classes. Learning and Instruction, 17 (1), 29–41. https://doi.org/10.1016/j.learninstruc.2006.11.004

Hao, J., Liu, L., von Davier, A., Kyllonen, P. C., & Kitchen, C. (2016). Collaborative problem solving skills versus collaboration outcomes: Findings from statistical analysis and data mining. EDM.

Herborn, K., Mustafic, M., & Greiff, S. (2017). Mapping an experiment-based assessment of collaborative behavior onto collaborative problem solving in PISA 2015: A cluster analysis approach for collaborator profiles. Journal of Educational Measurement, 54 (1), 103–122. https://doi.org/10.1111/jedm.12135

Herborn, K., Stadler, M., Mustafić, M., & Greiff, S. (2020). The assessment of collaborative problem solving in PISA 2015: Can computer agents replace humans? Computers in Human Behavior, 104, 105624. https://doi.org/10.1016/j.chb.2018.07.035

Howard, C., Di Eugenio, B., Jordan, P., & Katz, S. (2017). Exploring initiative as a signal of knowledge co-construction during collaborative problem solving. Cognitive Science, 41 (6), 1422–1449. https://doi.org/10.1111/cogs.12415

Jacobs, G. M., & Goh, C. C. M. (2007). Cooperative learning in the language classroom . SEAMEO Regional Language Centre.

Johnson, D. W., & Johnson, R. T. (1989). Cooperation and competition: Theory and research . Interaction Book Company.

Johnson, D. W., & Johnson, R. T. (2003). Assessing students in groups: Promoting group responsibility and individual accountability . Corwin Press.

Kim, J.-I., Kim, M., & Svinicki, M. D. (2012). Situating students’ motivation in cooperative learning contexts: Proposing different levels of goal orientations. Journal of Experimental Education, 80 (4), 352–385. https://doi.org/10.1080/00220973.2011.625996

Li, C. H., & Liu, Z. Y. (2017). Collaborative problem-solving behavior of 15-year-old Taiwanese students in science education. EURASIA Journal of Mathematics, Science and Technology Education, 13 (10). https://doi.org/10.12973/ejmste/78189

Li, S., Pöysä-Tarhonen, J., & Häkkinen, P. (2022). Patterns of action transitions in online collaborative problem solving: A network analysis approach. International Journal of Computer-Based Collaborative. Learning, 17 (2), 191–223. https://doi.org/10.1007/s11412-022-09369-7

Ma, Y. (2021). A cross-cultural study of student self-efficacy profiles and the associated predictors and outcomes using a multigroup latent profile analysis. Studies in Educational Evaluation, 71, 101071

Ma, Y. (2022). Profiles of student attitudes toward science and its associations with gender and academic achievement. International Journal of Science Education, 1-20.

Ma, Y., & Corter, J. (2019). The effect of manipulating group task orientation and support for innovation on collaborative creativity in an educational setting. Thinking Skills and Creativity, 33, 100587.

Maltz, D. N., & Borker, R. A. (1982). A cultural approach to male-female miscommunication. In J. J. Gumperz (Ed.), Language and social identity (pp. 195–216). Cambridge University Press.

Meyer, J. P., & Morin, A. J. (2016). A person-centered approach to commitment research: Theory, research, and methodology. Journal of Organizational Behavior, 37 (4), 584–612.

Muthén, L., & Muthén, B. (2019). Mplus user’s guide (1998–2019) . Muthén & Muthén.

Nichols, J. D., & Miller, R. B. (1994). Cooperative learning and student motivation. Contemporary Educational Psychology, 19 (2), 167–178. https://doi.org/10.1006/ceps.1994.1015

OECD. (2017). PISA 2015 Results (Volume V): Collaborative Problem Solving. PISA, OECD Publishing. https://doi.org/10.1787/9789264285521-en

Petty, R., Harkins, S., Williams, K., & Latane, B. (1977). The effects of group size on cognitive effort and evaluation. Personality and Social Psychology Bulletin, 3 (4), 579–582.

Reilly, J. M., & Schneider, B. (2019). Predicting the quality of collaborative problem solving through linguistic analysis of discourse . International Educational Data Mining Society.

Rosen, Y., Wolf, I., & Stoeffler, K. (2020). Fostering collaborative problem solving skills in science: The Animalia project. Computers in Human Behavior, 104. https://doi.org/10.1016/j.chb.2019.02.018

Rummel, N., Mullins, D., & Spada, H. (2012). Scripted collaborative learning with the cognitive tutor algebra. International Journal of Computer-Supported Collaborative Learning, 7 , 307–339.

Scherer, R., & Gustafsson, J.-E. (2015). The relations among openness, perseverance, and performance in creative problem solving: A substantive-methodological approach. Thinking Skills and Creativity, 18 , 4–17. https://doi.org/10.1016/j.tsc.2015.04.004

Schindler, M., & Bakker, A. (2020). Affective field during collaborative problem posing and problem solving: A case study. Educational Studies in Mathematics, 105 (3), 303–324. https://doi.org/10.1007/s10649-020-09973-0

Slavin, R. E. (1987). Ability grouping and student achievement in elementary schools: A best evidence synthesis. Review of Educational Research, 57 , 293–336.

Spurk, D., Hirschi, A., Wang, M., Valero, D., & Kauffeld, S. (2020). Latent profile analysis: A review and “how to” guide of its application within vocational behavior research. Journal of Vocational Behavior, 120 , 103445.

Stadler, M., Herborn, K., Mustafić, M., & Greiff, S. (2019). Computer-based collaborative problem solving in PISA 2015 and the role of personality. Journal of Intelligence, 7 (3). https://doi.org/10.3390/jintelligence7030015

Stoeffler, K., Rosen, Y., Bolsinova, M., & von Davier, A. A. (2020). Gamified performance assessment of collaborative problem-solving skills. Computers in Human Behavior, 104 . https://doi.org/10.1016/j.chb.2019.05.033

Summers, J. J., Beretvas, S. N., Svinicki, M. D., & Gorin, J. S. (2005). Evaluating collaborative learning and community. Journal of Experimental Education, 73 (3), 165–188. https://doi.org/10.3200/jexe.73.3.165-188

Sun, C., Shute, V. J., Stewart, A., Yonehiro, J., Duran, N., & D’Mello, S. (2020). Towards a generalized competency model of collaborative problem solving. Computers and Education, 143 . https://doi.org/10.1016/j.compedu.2019.103672

Sun, C., Shute, V. J., Stewart, A. E. B., Beck-White, Q., Reinhardt, C. R., Zhou, G. J., Duran, N., & D’Mello, S. K. (2022). The relationship between collaborative problem solving behaviors and solution outcomes in a game-based learning environment. Computers in Human Behavior, 128, 14, Article 107120. https://doi.org/10.1016/j.chb.2021.107120

Tang, P., Liu, H., & Wen, H. (2021). Factors predicting collaborative problem solving: Based on the data from PISA 2015. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.619450

Teig, N., Scherer, R., & Kjærnsli, M. (2020). Identifying patterns of students’ performance on simulated inquiry tasks using PISA 2015 log-file data. Journal of Research in Science Teaching, 57 (9), 1400–1429.

Unal, E., & Cakir, H. (2021). The effect of technology-supported collaborative problem solving method on students’ achievement and engagement. Education and Information Technologies, 26 (4), 4127–4150. https://doi.org/10.1007/s10639-021-10463-w

von Davier, A. A. (2017). Computational psychometrics in support of collaborative educational assessments. Journal of Educational Measurement, 54 (1), 3–11. https://doi.org/10.1111/jedm.12129

von Davier, M., Gonzalez, E., & Mislevy, R. (2009). What are plausible values and why are they useful? IERI Monograph Series, 2 (1), 9–36.

Vygotsky, L. (1978). Interaction between learning and development. Readings on the Development of Children, 23 (3), 34–41.

Wang. (2018). Collaborative problem-solving performance and its influential factors of 15-year-old students in four provinces of China: Based on the PISA 2015 dataset. Research in Educational Development, 38 (10), 60–68.

Webb, N. M. (1982). Peer interaction and learning in cooperative small groups. Journal of Educational Psychology, 74 (5), 642.

Weiner, B. (1972). Attribution theory, achievement motivation, and the educational process. Review of Educational Research, 42 (2), 203–215.

Wu, Z., Hu, B., Wu, H., Winsler, A., & Chen, L. (2020). Family socioeconomic status and Chinese preschoolers’ social skills: Examining underlying family processes. Journal of Family Psychology, 34 (8), 969–979.

Xu, S. H., & Li, M. J. (2019). Analysis on the student performance and influencing factors in the PISA 2015 collaborative problem-solving assessment: A case study of B-S-J-G (China). Educational Approach, 277 , 9–16.

Zheng, Y., Bao, H., Shen, J., & Zhai, X. (2020). Investigating sequence patterns of collaborative problem-solving behavior in online collaborative discussion activity. Sustainability (Basel, Switzerland), 12 (20), 8522.

Funding

This work was supported and sponsored by the Shanghai Pujiang Program [Grant Number 22PJC059].

Author information

Authors and affiliations

School of Education, Shanghai Jiao Tong University, No. 800 Dongchuan Road, Minhang District, Shanghai, 200240, China

Yue Ma, Huilin Zhang & Li Ni

Faculty of Education, Northeast Normal University, No. 5268 Renmin Street, Nanguan District, Changchun, 130024, Jilin Province, China

Da Zhou

Corresponding author

Correspondence to Da Zhou.

Ethics declarations

Competing interests

The authors declare that they have no competing interests that could have appeared to influence the work reported in this paper.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix PISA 2015 CPS sample unit: Xandar

OECD (2017), PISA 2015 Results (Volume V): Collaborative Problem Solving, PISA, OECD Publishing, Paris. http://dx.doi.org/10.1787/9789264285521-en

The detailed information of the PISA 2015 CPS sample unit, Xandar, can be found at https://www.oecd.org/pisa/test/CPS-Xandar-scoring-guide.pdf .

The unit consists of four independent parts, and all items within each part are also independent of one another. No matter which response a student selects for a particular item, the computer agents respond in a way that makes the unit converge, so every student faces an identical version of the next item.
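
To make this convergence property concrete, here is a minimal sketch of one "converging" item, assuming a simple representation in which every student choice maps to a scripted agent reply plus a single, fixed next item. The item identifiers, choices, and replies below are invented placeholders, not actual Xandar content.

```python
# Minimal sketch of a "converging" CPS item, loosely modeled on the unit design
# described above. The item identifiers, student choices, and agent replies are
# hypothetical placeholders, not actual Xandar content.

from dataclasses import dataclass, field


@dataclass
class ConvergingItem:
    item_id: str
    next_item_id: str                                   # every response path ends here
    agent_replies: dict = field(default_factory=dict)   # scripted reply per choice

    def respond(self, student_choice: str) -> tuple:
        """Return the agents' scripted reply and the (fixed) next item."""
        reply = self.agent_replies.get(
            student_choice, "Let's get back to the task."  # fallback keeps the unit on script
        )
        return reply, self.next_item_id                  # convergence: the next item never varies


# Whichever option the student picks, the unit always moves on to "part1_item2".
item = ConvergingItem(
    item_id="part1_item1",
    next_item_id="part1_item2",
    agent_replies={
        "ask_team_for_ideas": "Good idea - let's hear from everyone first.",
        "decide_alone": "Maybe we should agree on a plan together?",
    },
)

for choice in ("ask_team_for_ideas", "decide_alone", "something_else"):
    reply, next_id = item.respond(choice)
    print(f"{choice!r} -> agents reply {reply!r}; next item: {next_id}")
```

The branching is therefore cosmetic: the agents' scripted turns absorb the variation in student responses, so every student ends up at the same next item regardless of the path taken.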

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Ma, Y., Zhang, H., Ni, L. et al. Identifying collaborative problem-solver profiles based on collaborative processing time, actions and skills on a computer-based task. Intern. J. Comput.-Support. Collab. Learn. 18, 465–488 (2023). https://doi.org/10.1007/s11412-023-09400-5

Received: 15 November 2022

Accepted: 18 May 2023

Published: 30 August 2023

Issue Date: December 2023

DOI: https://doi.org/10.1007/s11412-023-09400-5

Keywords

  • Collaborative problem-solving
  • Collaborative process
  • Computer-based task
  • Latent profile analysis

Collaborative Problem Solving, A Talk with Dr. Stuart Ablon

A Flawless Foundation #FlawlessTalk

In this presentation at the Churchill School, sponsored by The Flawless Foundation, Dr. J. Stuart Ablon describes what causes challenging behavior and the Collaborative Problem Solving ® approach.

Highlights include:

  • What consequences do, and don't do
  • Collaborative Problem Solving is trauma-informed
  • What is discipline
  • Research on skills deficits
  • Planning an intervention using Collaborative Problem Solving

Collaborative Problem Solving, presented by The Flawless Foundation

Collaborative Problem Solving

This course will help you develop the collaborative problem-solving skills you need to succeed in virtually any work environment while focusing on the importance and many benefits of working in teams.

The Collaborative Problem Solving course will help you become familiar with the basics of working in teams and why teamwork is important to our professional and personal success. It will provide you with essential strategies for solving the problems and challenges that arise during collaboration, along with ways to move forward toward a common goal.

Topics covered include:

  • Requires consensus
  • Involves groups or teams to make decisions
  • Identifying the problem and the solution
  • Identifying the leadership of the group
  • Identifying the goal of the group
  • Collaborative problem-solving is used to maximize productivity, minimize expenses, resolve conflict, improve morale, and integrate departments
  • Using technology in the process
  • Best practices for managing the collaborative problem-solving process

Prerequisites:

There are no prerequisites to take this course.

Requirements:

Hardware Requirements:

  • This course can be taken on either a PC or Mac.

Software Requirements:

  • PC: Windows 8 or later.
  • Mac: macOS 10.6 or later.
  • Browser: The latest version of Google Chrome or Mozilla Firefox is preferred. Microsoft Edge and Safari are also compatible.
  • Adobe Acrobat Reader.
  • Software must be installed and fully operational before the course begins.
  • Email capabilities and access to a personal email account.

Instructional Material Requirements:

The instructional materials required for this course are included in enrollment and will be available online.

FURTHER READING

  1. Collaborative Problem Solving: The Ultimate Guide

  2. Think:Kids : Collaborative Problem Solving®

    What is Collaborative Problem Solving? Kids with challenging behavior are tragically misunderstood and mistreated. Rewards and punishments don't work and often make things worse. Thankfully, there's another way. But it requires a big shift in mindset.

  3. PDF Collaborative Problem Solving

    Defining collaborative problem solving: The term "collaboration" has different meanings in different environments. In K-12, collaboration almost always means an individual task can be solved by anyone in the group, but collaboration is also an instructional strategy to enable learning more efficiently or effectively.

  4. The effectiveness of collaborative problem solving in promoting

    Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses...

  5. Stages of the Collaborative Problem Solving Process

    Problem solving process: A typical collaborative process has three well-defined stages, each containing a number of steps, tasks or objectives. Getting Started (Stage One), pre-deliberation: initiate the process; assess issues and stakeholders; design a strategy; set up a program. Searching for Agreement (Stage Two): deliberation ...

  6. Understanding student teachers' collaborative problem solving

    In view of the growing awareness of measuring collaborative problem solving competency from process data, and the lack of focus of the previous studies on the composition of collaborative problem solving competency and its relationship with learning performance in real human-human interaction scenarios, this study intended to fill this gap by focusing on the compositions of student teachers ...

  7. How to ace collaborative problem solving

    To solve any problem—whether personal (eg, deciding where to live), business-related (eg, raising product prices), or societal (eg, reversing the obesity epidemic)—it's crucial to first define the problem.

  8. PDF Pisa 2015 Collaborative Problem-solving Framework July 2017

    1. Collaboration has been defined as a "co-ordinated, synchronous activity that is the result of a continued attempt to construct and maintain a shared conception of a problem" (Roschelle and Teasley, 1995, p. 70).

  9. What Is Collaborative Problem Solving and Why Use the Approach?

    Part of the Current Clinical Psychiatry book series (CCPSY). This chapter orients or reorients the reader to the fundamental philosophy and practice of Collaborative Problem Solving.

  10. Collaborative Problem Solving

    The PISA 2015 Collaborative Problem Solving assessment measures students' capacity to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution, and pooling their knowledge, skills and efforts to reach that solution.

  11. PDF Collaborative Problem Solving: Steps in the Process

    Collaborative problem solving: steps in the process. In this section you will find: collaborative problem solving vs. being positional; thoughts about preparation (figure out your interests, figure out their interests, consider some options, what's a fair standard?, keep an open mind); steps in the collaborative process: share perspectives, perception ...

  12. Section 11. Collaborative Leadership

    A collaboration among several groups and individuals is often needed to address a complex issue, and collaboration requires collaborative leadership. Collaborative leadership means maintaining a process that includes everyone involved in an issue or organization. A process that depends on collaborative problem solving and decision making.

  13. Collaborative Problem Solving

    Collaborative Problem-Solving in Schools outlines a process to help veteran and new leaders alike to create thoughtful, organized, and collaborative solutions for the simple to the most difficult problems they face. Rooted in theory, this comprehensive guide presents a seven-step process that addresses all types of problems.

  14. Full article: Measuring collaborative problem solving: research agenda

    We outline our ideas on potential ways to improve (1) generalizability in Human-Human assessment tools and ecological validity in Human-Agent ones; (2) flexible and convenient use of restricted communication options; and (3) an evaluation system of both Human-Human and Human-Agent instruments.

  15. Collaborative Problem Solving: A Resource Guide for Counselors

    Collaborative Problem Solving (CPS) is an evidence-based approach that focuses on understanding and addressing the root causes of challenging behavior in children and adolescents.

  16. Collaborative Problem Solving: Examples & Techniques

    Collaborative Problem Solving (CPS) is a process of civil argumentation wherein two or more parties negotiate agreeably to have conflicting needs met.

  17. Collaborative Problem-Solving Steps

    In general, these are the essential steps of "solving" open-ended, complex problems (problems with correct answers are best assigned to individuals, not groups): identify the problem (explain its significance/how it harms society, define its scope, etc.); gather information and research to increase understanding of the problem.

  18. Exploring collaborative problem solving in virtual laboratories: a

    The Organization for Economic Co-operation and Development (OECD, 2017) defines collaborative problem solving as "the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills, and ...

  19. Collaborative Problem Solving: Processing Actions, Time, and

    The parts require the respondent to collaboratively plan a process for problem solving, implement the process, reach a solution, and evaluate the solution (For a full description, see the Materials and Methods section, "Parts of the Xandar Task.")

  20. Understand group interaction and cognitive state in online

    Collaborative problem solving (CPS) Collaborative problem solving is defined as "the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills, and efforts to reach that solution" (OECD, 2017, p. 26).

  21. PDF Collaborative Problem-Solving Process in A Science Serious Game ...

    Keywords: collaborative problem-solving, learning process, serious game, Jaccard coefficient, KmL cluster analysis, science learning. Research has highlighted a need for a comprehensive understanding of collaborative problem-solving (CPS), which is regarded as one of the critical competencies of the 21st century skills [10].

  22. Identifying collaborative problem-solver profiles based on ...

    Understanding how individuals collaborate with others is a complex undertaking, because collaborative problem-solving (CPS) is an interactive and dynamic process. We attempt to identify distinct collaborative problem-solver profiles of Chinese 15-year-old students on a computer-based CPS task using process data from the 2015 Program for International Student Assessment (PISA, N = 1,677), and ...

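The profile analysis mentioned in the last entry above lends itself to a brief illustration. The sketch below shows, under simplifying assumptions, how solver profiles might be extracted from three process features (collaborative processing time, number of actions, and CPS skill score) using a Gaussian mixture model as a stand-in for latent profile analysis. The feature names, the synthetic data, and the three-profile solution are illustrative assumptions, not the published analysis.

```python
# Illustrative sketch only: a Gaussian mixture model stands in for latent
# profile analysis, and the three process features (time, actions, skill
# score) are synthetic placeholders rather than PISA 2015 process data.

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical per-student features:
#   column 0: collaborative processing time (minutes)
#   column 1: number of chat/task actions
#   column 2: CPS proficiency score
fast_sparse = rng.normal([12, 25, 420], [2, 5, 40], size=(300, 3))
slow_active = rng.normal([25, 60, 560], [3, 8, 40], size=(300, 3))
moderate    = rng.normal([18, 40, 500], [3, 6, 40], size=(300, 3))
X = np.vstack([fast_sparse, slow_active, moderate])

X_std = StandardScaler().fit_transform(X)

# Compare candidate numbers of profiles with BIC, as is common in LPA practice.
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, covariance_type="diag", random_state=0)
    gm.fit(X_std)
    print(f"profiles = {k}, BIC = {gm.bic(X_std):.1f}")

# Fit a chosen solution and describe each profile by its feature means.
model = GaussianMixture(n_components=3, covariance_type="diag", random_state=0)
labels = model.fit_predict(X_std)
for profile in range(3):
    time_m, actions_m, score_m = X[labels == profile].mean(axis=0)
    print(f"Profile {profile}: time={time_m:.1f} min, "
          f"actions={actions_m:.0f}, score={score_m:.0f}")
```

In practice the number of profiles would be chosen by comparing information criteria across candidate solutions rather than fixed in advance, and any real PISA analysis would also have to handle design features such as plausible values and sampling weights (von Davier et al., 2009); the sketch only conveys the overall shape of the workflow: standardize process features, compare solutions, then characterize each profile by its feature means.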