
ECD Module 3

Unit 4 Online Study Guide. Learning Unit 4: Reflect on the ECD Programme

  After completing this learning unit, you will:

  • Have reflected on the ECD programme
  • Be able to evaluate the design of activities. 

Reflect on the ECD programme

Reflecting on your ECD programmes is an excellent way to ensure that you will always strive towards improvement. Bear in mind that reflection may take place before, during and after implementation of your programme.

Providing the space, incentive, time and means for ECD practitioners to reflect on their programmes and practice is essential for an effective ECD centre. Without feedback and reflection time, ECD practitioners sometimes work mechanically (as if on autopilot) and without objectives.

Reflection allows you to build on the strengths of your programme, and to resolve weaknesses and problem areas. This means that the programme is always evolving, strengthening and improving. When you see the positive impact of your programme reflections, you should feel proud and resolve to carry on with renewed energy. Furthermore, the process of self-reflection can also become a vehicle for collaboration.

Make self-reflection a priority, and create space for it. Your own success and that of the ECD centre depend on it.

4.1 Obtain feedback from relevant sources on the value and success of the programme

When we want to reflect on the value and success of our ECD programme, we have to start by obtaining feedback from relevant stakeholders (the sources of feedback). Let's begin by examining a mind map of who these stakeholders or feedback sources could be.

[Mind map: possible stakeholders and sources of feedback]

a. Mentors, colleagues and other ECD practitioners. Your mentor or your colleagues in the ECD centre can give you feedback and help you to evaluate how well you are preparing and implementing programmes. You may want to set up regular feedback sessions while you are still building up experience as an ECD practitioner. Later, you could help to mentor or support other new practitioners. A peer feedback group also enables you to obtain relevant feedback from your peers.

b. ECD principal. The ECD centre director or principal will be a highly experienced practitioner, so she will be an excellent feedback source. She will also probably have a bird’s-eye view of your playgroup, so she may see trends and possibilities more easily.

c. Parents. There are many ways to obtain feedback from parents:

  • Informal conversations when parents drop off or collect children
  • Telephone calls
  • Home visits
  • Parent meetings
  • Questionnaires
  • A notebook in the child’s bag that the ECD practitioner and parents use to correspond with each other

Usually, the ECD practitioner will use all of these methods to obtain meaningful feedback, both positive and negative. Remember: you are most likely to receive the best feedback during informal conversations with parents when they are feeling relaxed. Note down relevant suggestions and where appropriate discuss them further at a parent meeting or through a questionnaire.


e. Grade R or Foundation Phase teachers. Your ECD centre will probably be a feeder centre for one or two local primary schools. Build up a relationship with the Grade R or Foundation Phase teachers. They will be in a good position to give feedback on your programme, as they can assess the children’s learning and development when they enter Grade R.

f. Community members or organisations. People outside the immediate ECD centre staff and the children’s families also have an impact on the success of the ECD centre. You can informally ask these individuals for feedback. If what they have to say is really relevant, you may invite them to a parent meeting and ask them to address your colleagues and the children’s parents.

The feedback book: Have a feedback book on hand at the ECD centre. This can either be your personal feedback book or you can create a book for all the ECD practitioners to use. Write down feedback as you receive it, otherwise you might forget before you have a chance to use a suggestion or share a compliment or complaint with your colleagues. You could also use a comments box or feedback book for those parents who want to remain anonymous. As trust grows, parents will feel more comfortable about speaking directly to you.

  4.1.1 Obtain feedback on the application of the activities

When you conduct evaluations, you collect and examine evidence in order to make judgements about something’s value. An evaluation is basically making value judgements about something by looking at its advantages and disadvantages. A value judgement is a judgement about how effective or ineffective, good or bad, successful or unsuccessful something is. When you evaluate something, you want to assess whether it meets its intended goals. So, how does this relate to the evaluation of activities in an ECD playroom? You will have to make value judgements about your activities, their strengths and weaknesses, and whether they meet their intended goals.

Why do you need to evaluate the activities you design? Because an evaluation allows you to assess your current teaching practice and improve your future teaching practice. Evaluation gives you the opportunity to:

  • Learn from your mistakes
  • Identify your areas of strengths and your areas of challenge
  • Make sure that you are being effective
  • Check how appropriate things are
  • Ensure that what you do matches the purpose
  • Identify ways in which you can change and grow

Evaluation forms an important part of your ongoing professional growth and development as an ECD practitioner.

Evaluations of activities in an ECD playroom have to meet a number of criteria. The evaluations you conduct must:

  • Reveal the activities’ strengths and weaknesses in relation to their purpose
  • Be consistent and systematic
  • Draw on feedback, observations and/or reflections
  • Assess the contribution of the activities to the ECD’s aims

Let’s look at each one of these in more detail.  

a. Evaluations must reveal the strengths and weaknesses of activities in relation to their purpose. When you design your activities, they always have a purpose which is linked to the developmental outcomes. For example, the purpose might be to develop fine motor skills or to encourage children to share. The purpose is the intended goal of the activity. So, one of the things that you need to evaluate is whether an activity achieved its purpose. You can assess the strengths and weaknesses of an activity in relation to its purpose. The strengths will be the successes, the advantages, the effective parts of an activity: what worked. The weaknesses will be the aspects that were not successful, the disadvantages, the ineffective parts of an activity: what didn’t work.

You can learn from both the strengths and weaknesses of an activity. The strengths of an activity teach you what is effective so that you can use those aspects again. The weaknesses provide you with opportunities to change what didn’t work so you can improve next time you do that activity or when you design another activity.

b. Evaluations must be consistent and systematic. A consistent evaluation is one that always looks at the same criteria. A systematic evaluation is one that is orderly and well planned. So, how can you make your evaluations consistent and systematic? A useful way is to develop a checklist of the criteria that you need to evaluate. Then you can be sure that you are always assessing the same criteria in an orderly way.

The activity plan that you used to describe your activities can form the basis for the checklist of the criteria that you need to evaluate. The table below gives you a checklist based on the Activity Plan:

[Table: evaluation checklist based on the Activity Plan]

The Comments/Evidence column is for comments explaining the rating or for evidence to justify the rating. Evidence can be any proof that supports a rating, such as a child’s comment, an observation about a resource that broke, or the products that the children produce in an activity. The Comments/Evidence column is there to avoid rather meaningless generalisations about activities "going well" or "being a disaster". An evaluation must be specific about how and why an activity is being judged as effective or ineffective.
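For practitioners who keep their records electronically, the rating-plus-evidence idea can be captured in a very simple structure. The sketch below is purely illustrative: the criteria and field names (criterion, rating, evidence) are assumptions for the example, not a prescribed Activity Plan format.

```python
# Illustrative sketch of a consistent activity-evaluation checklist.
# The criteria and field names are assumed for this example only.

CRITERIA = [
    "Purpose achieved (linked to developmental outcomes)",
    "Resources were suitable and safe",
    "Instructions were clear to the children",
    "Timing and pacing were appropriate",
]

def new_checklist(activity_name):
    """Create an empty checklist so every activity is judged against the same criteria."""
    return {"activity": activity_name,
            "entries": [{"criterion": c, "rating": None, "evidence": ""} for c in CRITERIA]}

def record(checklist, criterion, rating, evidence):
    """Record a rating (e.g. 1-4) together with the evidence that justifies it."""
    for entry in checklist["entries"]:
        if entry["criterion"] == criterion:
            entry["rating"] = rating
            entry["evidence"] = evidence

checklist = new_checklist("Water play: measuring with containers")
record(checklist, CRITERIA[0], 3, "Most children compared 'full' and 'half full' without prompting.")
print(checklist)
```

Keeping the same criteria for every activity is what makes the evaluation consistent and systematic, and the evidence field forces each rating to be specific rather than a vague "it went well".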

c. Evaluations must draw on observation, reflection or feedback. When you conduct an evaluation, you can use three ways to gather information about the value and success of an activity. These three ways are:

  • Observation
  • Reflection
  • Feedback

Observation. You learned about doing observations when you were evaluating the learning resources you adapted. To recap, observations are about watching the children doing the activity. During an observation you are watching carefully to see whether the purpose of the activity is being achieved and to assess the strengths and weaknesses of the activity. Observation needs to focus on what was said and done during the activity. The observation is a good time to complete the checklist. The checklist helps you to record the observations and allows space to note down examples to illustrate comments.

Observation is part of an ongoing cycle: you observe the children during the activity, record your observations, evaluate the activity and then you use these evaluation comments to help you to improve your activity and design new activities. Then you implement (do) your improved activity or your new activity and the cycle starts again. In this way, your observation and evaluation inform your practice (that means that your practice is based on your observation and evaluation).  

[Diagram: the observation and evaluation cycle]

Feedback. Feedback is information about a performance that leads to action to affirm or develop that performance. The "performance" in this case is the activity. So, how is feedback different from reflection? Both are based on observation, but reflection is usually self-reflection: it is what you think and feel about the activity. Feedback usually comes from other people, such as your colleagues, the children in your playgroup and their parents. In a way, you could say that reflection is your own feedback on an activity and feedback is other people’s reflections on an activity.

Feedback gives you the information you need to reinforce the effective things you are doing and to identify areas where you can improve. When you get feedback, it should motivate you to improve yourself as an ECD practitioner. Feedback is an essential part of your own learning. It helps you to maximise your potential, raise your awareness of your strengths and weaknesses, and identify actions you can take to improve your performance. Feedback helps you to plan productively for the next activity.

Feedback can be informal or formal. Informal feedback could be a comment from a child, a parent, or a colleague during or after the activity. For example, a child might say "I don’t like this glue. It doesn’t stick". From that feedback you would know that you need to improve the quality of the glue next time. A parent might say "I don’t know what you did but all of a sudden James can button his shirt. Thank you!" From that feedback you would know that the activities you’ve been doing to develop fine motor skills and putting on clothes are successful. A colleague might say "Your group was so excited at snack time. They couldn’t stop talking about what fun they had in the music ring. You must share that activity with me."

Formal feedback would be a planned event. For example, you might ask a colleague to sit in on an activity and make observations and then give you feedback. Or you could work with parents to address a specific need of their child and have a feedback discussion about whether the activities are helping or not.

Often formal feedback from a colleague is given by using a feedback form. Your colleague will complete the form while she is observing your activity and then use it as the basis to give you feedback afterwards.

The table below gives an example of a feedback form.

[Table: example feedback form]

As you can see, this kind of feedback form is much more open-ended than the checklist you developed for observation. But your colleague can also use the checklist as a guide to the sorts of things to look for. As with the checklist, your colleague needs to provide evidence and examples in the comments that she writes on the feedback form.

Your colleague will use this feedback form as the basis for the feedback session you will have. Usually colleagues evaluate one another, so you will have opportunities to reinforce positive behaviours and strengths as much as to look at areas where improvement can be made. When you are doing a peer observation, you need to give as much attention to the evidence for effective performance as for ineffective performance. You need to give both affirmative and developmental feedback. Affirmative feedback tells your colleague what she did well. Its purpose is to encourage the person and to reinforce their behaviour. Developmental feedback tells your colleague what needs to be done better and how to do it. Its purpose is to help the person see how she could do better next time. The key to successful feedback is to give the person a manageable amount to go away with and put into practice.

When you evaluate an activity you will always use observation and self-reflection but you should also try to use feedback from others as well. This will give you a more balanced way to evaluate your activities. You may be too judgemental of your own activities or you may be unable to see any problems. Another person’s observations and feedback will help you to see the activity from another perspective.

4.2 Reflect to identify strengths and weaknesses of the programme

As an ECD practitioner, you need to know how to evaluate the activities you design. You must know how to reveal the strengths and weaknesses of your activities. The purpose of finding out what works and what doesn’t is so you can improve and extend the activities you have designed. When you evaluate your activities you need to do so in a consistent and systematic way. You need to get feedback from a variety of relevant sources such as your colleagues, the children in your playgroup and their parents. You also need to do self-reflection and record the findings of your evaluation.

  4.2.1 Identify the strengths and weaknesses of the activities

A strengths and weaknesses table is a useful way to organise anecdotal feedback comments (that is, comments based on individuals’ stories and remarks). Use this table when you revise your programme. Pay particular attention to recurring feedback, for example when you hear similar suggestions from the parents or from several different stakeholders. You may like to call on your mentor or a group of peers when you do programme revisions, to make sure you retain the strengths of the programme so that children feel safe and secure.

The following table is an example of an ECD programme’s strengths and weaknesses:

[Table: example of a programme’s strengths and weaknesses]

How does the programme contribute meaningfully to the overall aims of the ECD service?

What are the overall aims of your ECD service? Usually these aims are stated in an ECD centre’s vision and mission. For example, let’s examine the aims of a typical ECD service. The ECD service below lists six main aims. It states that by participating in the ECD learning programme, children will:

  • Develop confidence and self-reliance in themselves as learners
  • Demonstrate curiosity and enjoy learning
  • Develop the ability to focus their attention and complete structured activities
  • Develop a level of communicative competence that is personally satisfying
  • Acquire social skills and abilities which enable them to relate to other children and to adults
  • Remain true to their individual natures, being free to develop to their own potentials.

This is a very useful set of overall aims. For you as an ECD practitioner, a set of aims like this can help you reflect and adjust your learning programme effectively. For example, you could use a rubric to help you check if your learning programme is matching the identified aims.

The following table is an example of a rubric to check how a programme matches its stated aims:

[Table: example rubric for checking a programme against its stated aims]
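Since the original rubric table is not reproduced here, the short sketch below only illustrates the idea: each of the six stated aims is scored on a simple scale, with a comment recording the evidence. The 1-4 scale, the comment field and the example wording are assumptions for illustration, not the module’s official rubric.

```python
# Illustrative rubric sketch: score the programme against its stated aims.
# The 1-4 scale and comment field are assumptions, not an official format.

AIMS = [
    "Develop confidence and self-reliance as learners",
    "Demonstrate curiosity and enjoy learning",
    "Focus attention and complete structured activities",
    "Develop personally satisfying communicative competence",
    "Acquire social skills for relating to children and adults",
    "Remain true to their individual natures and potentials",
]

def aims_needing_attention(scores):
    """scores maps each aim to (rating 1-4, comment); returns aims rated below 3."""
    return [aim for aim, (rating, _comment) in scores.items() if rating < 3]

example = {aim: (3, "Seen in daily observations") for aim in AIMS}
example[AIMS[3]] = (2, "Several children still reluctant to speak at group time")
print(aims_needing_attention(example))
```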

4.3 Reflect to identify the extent to which the programme contributes meaningfully to the overall aims of the ECD service

Once you have made or adapted and used a resource, you need to reflect on how effective it was in achieving its purpose and whether any improvements or changes are needed. There are various steps in reflection that we will discuss.

Why reflect? Reflection is about examining and reviewing a product or process. It is defined as "to think, ponder, or meditate". We need to reflect on our resources in order to:

  • Ensure that the resource supported the activity adequately and did not distract from the planned learning outcomes
  • Identify whether it was useful, effective and appropriate for the activity and the developmental needs and interests of the children
  • Identify its suitability in terms of an ECD context and learning programme
  • Look at possible improvements as regards its safety, durability, bias and ability to meet any special needs of learners

We do not always willingly reflect on our work – we normally only do so when it is required of us, and then we somehow feel that we are on the defensive. Unless we are honest in our reflections, we will never be able to improve on our efforts or the resources we have provided. There is a difference between being overly critical and being reflective. When we reflect, we do so because we want to grow and learn.

As an ECD practitioner, you will need to develop these skills and reflect on your practice so that you can develop yourself and your facilitation skills. You have to challenge yourself to become more creative and to grow. As you grow, so the children in your care will benefit, and you will find that dealing with the challenges of each day in a school becomes easier.

Instead of seeing reflections and evaluations as a burden – see them as an opportunity.

Remember, "Attitude determines altitude".

4.3.1 Reflect on the extent to which the designed activities contribute meaningfully to the overall aims of the ECD service

As you know, when you reflect on something, you look at it carefully and think about it critically. Reflection is about examining and reviewing. When you are busy facilitating an activity you tend to make quick decisions to deal with any issues that arise. Reflection allows you the luxury of time to examine these decisions at your leisure and make further decisions about how you wish to respond should similar circumstances arise again. These decisions then become part of your activity when you do it again. Observation is done during an activity but reflection is done after an activity.

After doing an activity, it is important to stop and reflect on the activity. This is not so that you can indulge in self-congratulation or regrets, but rather so that you may have a basis for your own learning by reflecting on experience: this activity was unsatisfactory, what could I have done to improve it? Or: this activity was good, what was it exactly that made it good?

When you reflect you can ask yourself: "Did the activity go according to plan?" Although an activity that went according to plan will probably be effective, you also need to ask yourself whether the plan was a good one in the first place. A sensitive and flexible ECD practitioner will plan with different needs in mind and adapt to various changing circumstances such as the needs of the children.

You should try to reflect on an activity as soon as possible after you have done it, while your ideas and observations are still fresh in your mind. Your reflection notes don’t need to be long, but you do need to write your thoughts down. That is why there is a section for reflection at the end of each Activity Plan.

Like observation, reflection is part of an ongoing learning cycle. In this learning cycle, you plan an activity and then you do the activity. After doing the activity you reflect on the activity and use that reflection to again direct your next action. That action could be to improve your activity, try something different or use something similar for a different purpose. Then you are back into planning and the cycle continues.  

[Diagram: the plan, do, reflect learning cycle]

4.4 Identify and note ways to improve upon the programme for future plans and programmes

As an ECD practitioner, you need to make sure that you evaluate your programme activities regularly. The best way to do this is to structure evaluation sessions and stick to them! Most effective ECD practitioners use an evaluation schedule like this:

ECD Evaluation schedule  

a. Daily evaluation. Write up feedback comments on Activity Plans and the daily programme during the day. At the end of the day, take ten minutes to reflect and make simple adjustments.

b. Weekly evaluation. At the end of the week, take 15 minutes to reflect on the weekly programme. Note any problem areas you experienced. Also check the feedback book. Make any revisions to the following week’s programme.

c. End-of-term evaluation. Meet with other ECD practitioners. Review the programme together. Check that the learning programmes for the different playgroups link well together. Ask questions like the ones in the checklist that follows. Discuss and implement improvement strategies.

Making an effort to improve your ECD service. How can you ensure that you provide a consistently good-quality ECD service? You always have to make an effort to improve the quality of the service you offer. You can do this by identifying the problems and weaknesses in your ECD service. You can also build on your strengths and what you already have in place in your ECD service. Remember to work co-operatively with co-workers, families and the community. These groups know your ECD service well. They will have many ideas for ways to improve. They may also notice problems and weaknesses that you have overlooked.

  Suggested ways to improve the quality of your ECD service

  • Do you recognise the individual needs of the child?
  • Are you accountable to the community and caregivers?
  • Do you provide experiences that challenge children, and are achievable?
  • Are problem-solving and critical thinking an integral part of the programme?
  • Do you provide opportunities for the child to develop in a holistic way?
  • Do you avoid stereotyping and bias?
  • Do you encourage integrated learning?
  • Are there many opportunities for experiential learning?
  • Do you insist on regular assessment and evaluation?
  • Do you encourage free play and unguided activities?
  • Do you plan in advance?
  • Do you spend enough time playing with the children?
  • Do you meet regularly as staff to discuss the children’s progress as well as the standard of care and education?

Now take a look at these opportunities:

1. Do you recognise the individual needs of the child? All children are different and need to be treated as special. They acquire knowledge and skills at different developmental stages. As an ECD practitioner, you must take this into consideration when you plan your ECD programme. The mode of learning will not be the same for each child.

2. Are you accountable to the community and caregivers? As a practitioner you need to be transparent and available to those who entrust their children to your care. The parent and caregiver community should be kept informed of, and involved with, what is happening in the playschool. We can no longer be practitioners in isolation. We need to work with the community and share expertise with colleagues. It is up to you to be well informed and knowledgeable about ongoing developments in early childhood development.

3. Do you provide experiences that challenge children and are achievable? As an ECD practitioner you take learners from the known to the unknown. Small children have had varied experiences, which provide you with a starting point. Try not to under- or overestimate what a young child is capable of learning. Provide a stimulating learning environment where children can be challenged to reach their full potential. If you know your learners well, it will be easier for you to provide an enriching programme.

4.   Are problem-solving and critical thinking an integral part of the programme? Problem solving is a process of thinking, identifying and finding solutions to everyday situations. This is a life skill that helps children to feel independent and builds their self-esteem. They learn that they have the resources to deal with situations that need answers. This realisation empowers the child to try out possibilities in a supportive environment. It implies that you should avoid stepping in before a child has had the chance to try something for him/herself. You should never interfere to the extent that you stunt (inhibit) their experiential learning.

5.   Do you allow the child to develop in a holistic way? The child should develop emotionally, intellectually, physically, socially, spiritually and creatively. Such holistic development can be achieved through interactive play, storytelling, listening to music, discussing feelings, drama and role play. Children also need to be exposed to an assortment of resources that they explore in a variety of ways.

6. Do you avoid stereotyping and bias? You can encourage children to respect and value all human beings by helping them to understand and celebrate our different cultures, traditions, social customs, likes and dislikes. You, as the ECD practitioner, need to have an unbiased approach to gender, race, language, physical ability and children’s special needs. You also need to be aware of other forms of bias that might influence your teaching.

Remember, the children will see you as their role model when they have to deal with their peers.

7. Do you encourage integrated learning? Children learn best in an integrated environment rather than in isolation. The curriculum should combine all learning areas where possible and avoid fragmentation (breaking up the content into bits and pieces). For example, if "water" is the topic for the week, you could incorporate a story about water (e.g. the water cycle), measuring using containers of different sizes and shapes, a song about water, uses of water in the home, experiments with water, stories about water: the possibilities are endless. Literacy, numeracy and life skills, as explained in the NCS documents, can each be focused on as individual skills. However, they are all interrelated and occur throughout the day in the ECD environment.

8. Are there many opportunities for experiential learning? A hands-on approach is worthwhile and valuable. Children learn best when they do and see, rather than from just being told. Active learning is far more effective than passive rote learning and memorising. You can promote active learning by using different questioning techniques, experimentation and a wide variety of learning materials.

9. Do you insist on regular assessment and evaluation? Plan your programme to include continuous assessment and evaluation. Ongoing observation and reviewing of the learners’ skills and abilities is an integral part of the ECD practitioner’s day. This helps with the learning process and assists the practitioner when planning her or his programme. The concept of continuous assessment and evaluation is one of the philosophical pillars on which outcomes-based education is based. Do keep in mind that testing can be very stressful for young children, and that the results are not always reliable, because children also have good and bad days.

10.   Do you encourage free play and unguided activities? Informal play and games are very important for the child’s development because they promote curiosity, problem solving and co-operative learning skills. Not all activities should be guided. Children should be encouraged to experiment, discover and invent their own activities and rules.

11. Do you spend enough time playing with the children? Play is the most important activity in the lives of children. Sometimes it seems more important than eating and sleeping. Play can be easy and fun, and it can also involve trying hard to do something right.

Play is the work, the occupation of childhood.

Why is play important?

  • Play is important because it helps children grow strong and healthy
  • Play is important because children can learn about the meaning of things in the world
  • Play is important because it helps children learn about people
  • Play is important because it helps children learn and grow in a way that helps them feel good about themselves
  • Play is important because it is practice for being grown-up

You as an ECD practitioner do not work in isolation. Meeting with colleagues to discuss problems will always improve conditions for the children. This not only provides mutual support but also allows you to help less experienced staff members to deal with issues and to maintain quality.

  4.4.1 Identify and record useful ways to improve upon and extend the activities for further use

If an activity does not work, it doesn’t mean you are a "bad" teacher. You are only a "bad" teacher if you don’t reflect on your activities and make the effort to revise the elements of the activity that did not work. The whole point of doing evaluations is for you to improve and extend your teaching practice. A good evaluation should highlight both the strengths and weaknesses of an activity. There are different ways to use evaluations to improve and extend activities. When you evaluate the strengths of an activity you are asking yourself questions such as "What worked?" "Why did it work?" The point is to learn what you are good at. Perhaps you can use this information to improve something else that didn’t work. Or you can extend that which worked for one activity when you design another activity.

When you evaluate the weaknesses of an activity you are asking yourself questions such as "What problems arose?" "Why did they arise?" "How did I deal with them?" The purpose of this questioning is not to focus on the negative. The point is to learn from what has happened. You can turn problems and weaknesses into opportunities to improve and extend your teaching practice.

A good evaluation probes beneath the surface of issues and does not quickly assume that the source of an issue has been located. This involves a willingness to keep asking "why?" and to consider alternative explanations. For example, was your playgroup "unresponsive" to an activity because they were bored or because it was too difficult or was it the specific time of day and they were too tired? You need to keep digging until you find a reasonable explanation for an issue.

When we discussed doing observations and giving feedback, you were encouraged to be specific. This will help you when you want to make changes to your activities. You can look at an activity and ask yourself questions about each aspect, such as:

  • What could I have done in my design to avoid problems?
  • If I do this activity again, which specific areas need to be improved?
  • How can I improve those specific areas?
  • What extension activities can I do to help the children with needs and issues that arose from the activity?
  • What follow-up activities can I do to consolidate the skills the children developed in this activity?

  A good evaluation will consider alternative approaches, which could be adopted in future activities. Your adoption of these approaches will be firmly based on the evidence from the evaluation of the activity. It will not just be a case of randomly trying something different. Teaching is a profession that requires constant introspection (looking at yourself and your teaching style) for serious growth and development to take place.

When you have decided how to improve and extend your activities, you need to record your decisions. The Activity Plan that you used to describe your activities has a section for evaluation where you can record your decisions. This section will be a summary of all the feedback, observations and reflections that make up your evaluation as well as the decisions you have made to improve and extend your activity based on that evaluation. You need to record these decisions to help you when you want to do the activity again or when you design another activity. You also need to record the decisions to help any other ECD practitioner who wants to use your Activity Plan. Keep your evaluations in the file with your Activity Plans. This file is an important resource to help you grow and develop as an ECD practitioner.


Summative Assessment

You are required to complete a number of summative assessment activities in your Learner Portfolio of Evidence (PoE) Guide, which will guide you as to what you are required to do:

  • Complete all the required administration documents and submit all the required documentation, such as a certified copy of your ID, a copy of your CV and relevant certificates of achievement:
      • Learner personal information form
      • Pre-assessment preparation sheet
      • Assessment plan document
      • Declaration of authenticity form
      • Appeals procedure declaration form
  • Place your complete learner workbook (with the completed class activities) in the specified place in the learner PoE guide
  • Complete the knowledge questions under the guidance of your facilitator
  • Complete the other summative assessment activities in your workplace

Once you have completed all the summative activities in your Learner PoE guide, complete the assessment activities checklist to ensure that you have submitted all the required evidence for your portfolio, before submitting your portfolio for assessment.


Review Article | Open access | Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

Enwei Xu (ORCID: 0000-0001-6424-8169), Wei Wang and Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


Subjects: Science, technology and society

Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This current research presents the major findings of a meta-analysis of 36 pieces of the literature revealed in worldwide educational periodicals during the 21st century to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem solving can result in a rise or decrease in critical thinking. The findings show that (1) collaborative problem solving is an effective teaching approach to foster students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem solving can significantly and successfully enhance students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have an impact on critical thinking, and they can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students’ critical thinking in the context of collaborative problem-solving.

Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020 ). Duch et al. ( 2001 ) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses problems with poor structure in real-world situations as the starting point for the learning process (Liang et al., 2017 ). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004 ; Liang et al., 2017 ).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Therefore, the best approach for developing and enhancing critical thinking throughout collaborative problem-solving is to examine how to implement critical thinking instruction; however, this issue is still unexplored, which means that many teachers are incapable of better instructing critical thinking (Leng and Lu, 2020 ; Niu et al., 2013 ). For example, Huber ( 2016 ) provided the meta-analysis findings of 71 publications on gaining critical thinking over various time frames in college with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. The usefulness of collaborative problem-solving in fostering students’ critical thinking, however, was not determined by this study, nor did it reveal whether there existed significant variations among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. ( 2020 ) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.

The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to detect and decide whether and to what degree collaborative problem-solving can result in a rise or decrease in critical thinking. Meta-analysis is a quantitative analysis approach that is utilized to examine quantitative data from various separate studies that are all focused on the same research topic. This approach characterizes the effectiveness of its impact by averaging the effect sizes of numerous qualitative studies in an effort to reduce the uncertainty brought on by independent research and produce more conclusive findings (Lipsey and Wilson, 2001 ).
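As a concrete illustration of what "averaging the effect sizes of numerous studies" means in practice, the sketch below pools made-up per-study effect sizes with standard fixed-effect inverse-variance weighting. It is a generic textbook calculation, not the authors' RevMan workflow, and all numbers are fabricated.

```python
import math

# Generic fixed-effect meta-analysis sketch: pool per-study effect sizes by
# inverse-variance weighting. Illustrative only; numbers are fabricated.
studies = [
    # (effect size d, variance of d)
    (0.90, 0.04),
    (0.55, 0.02),
    (1.10, 0.09),
]

weights = [1.0 / var for _, var in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval around the pooled effect
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled d = {pooled:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

Studies with smaller variance (larger samples) receive more weight, which is how a meta-analysis reduces the uncertainty attached to any single study.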

This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

How are the disparities between the study conclusions impacted by various moderating variables if the impacts of various experimental designs in the included studies are heterogeneous?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of Cooper’s ( 2010 ) proposed meta-analysis approach for examining quantitative data from various separate studies that are all focused on the same research topic. The relevant empirical research that appeared in worldwide educational periodicals within the 21st century was subjected to this meta-analysis using Rev-Man 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which shows the number of articles included and eliminated during the selection process based on the stated study eligibility criteria.

Figure 1: This flowchart shows the number of records identified, included and excluded in the article.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, and a consensus rate of almost 94.7% was reached after discussion and negotiation to clarify any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text could be obtained. Articles that did not meet the publication language and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.

The variable information for the experimental design covered three kinds of variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). In line with the topic of this study, the independent variable, the intervention strategy, was coded as collaborative or non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables found within the 36 studies (see Table 1): learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported, technique-supported, and resource-supported learning scaffolds; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.
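To make this coding scheme easier to scan, it can be restated as a simple data structure. The sketch below merely re-expresses the categories listed in the text; the field names are our own shorthand rather than the exact labels of Table 1.

```python
# A compact restatement of the coding template described in the text.
coding_template = {
    "descriptive_information": ["publishing year", "author", "serial number", "title"],
    "independent_variable": ["collaborative problem-solving",
                             "non-collaborative problem-solving"],
    "dependent_variable": ["cognitive skills", "attitudinal tendency"],
    "moderating_variables": {
        "learning_stage": ["higher education", "high school",
                           "middle school", "primary school or lower"],
        "teaching_type": ["mixed courses", "integrated courses", "independent courses"],
        "intervention_duration": ["0-1 weeks", "1-4 weeks", "4-12 weeks",
                                  "more than 12 weeks"],
        "group_size": ["2-3 persons", "4-6 persons", "7-10 persons",
                       "more than 10 persons"],
        "learning_scaffold": ["teacher-supported", "technique-supported",
                              "resource-supported"],
        "measuring_tool": ["standardized (e.g., WGCTA, CCTT, CCTST, CCTDI)",
                           "self-adapting (modified or made by researchers)"],
        "subject_area": "coded from the specific subjects in the 36 included studies",
    },
    "data_information": ["sample size", "mean value", "standard deviation"],
}
```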

The data information contained three metrics for measuring critical thinking: sample size, mean value, and standard deviation. It is vital to remember that studies with different experimental designs frequently require different formulas to determine the effect size; this paper therefore used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
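For illustration, the pretest-posttest-control SMD described by Morris (2008) divides the difference between the treatment and control pre-post gains by the pooled pretest standard deviation and applies a small-sample bias correction. The sketch below assumes that standard variant with hypothetical numbers; the exact formula the authors applied is reproduced in their Supplementary Table S3.

```python
import math

def morris_smd(m_pre_t, m_post_t, sd_pre_t, n_t,
               m_pre_c, m_post_c, sd_pre_c, n_c):
    """Pretest-posttest-control standardized mean difference (Morris, 2008).

    The treatment-minus-control difference in pre-post gains is divided by
    the pooled pretest SD and multiplied by a small-sample bias correction.
    """
    df = n_t + n_c - 2
    sd_pooled_pre = math.sqrt(((n_t - 1) * sd_pre_t ** 2 +
                               (n_c - 1) * sd_pre_c ** 2) / df)
    c_p = 1 - 3 / (4 * df - 1)  # small-sample bias correction
    gain_t = m_post_t - m_pre_t
    gain_c = m_post_c - m_pre_c
    return c_p * (gain_t - gain_c) / sd_pooled_pre

# Hypothetical example: the treatment group gains 8 points and the control
# group gains 2, with pretest SDs around 10 and 30 students per group.
print(round(morris_smd(50, 58, 10, 30, 51, 53, 9.5, 30), 2))
```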

Procedure for extracting and coding data

According to the data coding template (see Table 1), the information from the 36 papers was retrieved by two researchers and entered into Excel (see Supplementary Table S1). During data extraction, the results were extracted separately if an article contained several studies on critical thinking, or if a study assessed different dimensions of critical thinking. For instance, Tiwari et al. (2010) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficient was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). The sample data were then tested for publication bias and heterogeneity using the RevMan 5.4 software, and the test results were used to conduct the meta-analysis.

Publication bias test

Publication bias is exhibited when the sample of studies included in a meta-analysis does not accurately reflect the general state of research on the relevant subject. Because publication bias can affect the reliability and accuracy of a meta-analysis, the sample data need to be checked for it (Stewart et al., 2006). A popular method for checking publication bias is the funnel plot: publication bias is unlikely when the data points are dispersed evenly on either side of the average effect size and concentrated toward the upper region of the funnel. The funnel plot for this analysis (see Fig. 2) shows the data dispersed evenly within the upper portion of the funnel, indicating that publication bias is unlikely in this case.

Figure 2. Funnel plot showing the result of the publication bias test for the 79 effect quantities across 36 studies.
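A funnel plot of this kind can be drawn directly from the extracted effect sizes and their standard errors. The sketch below uses simulated values in place of the paper's 79 effect quantities, purely to illustrate the construction: effect size on the x-axis, standard error on an inverted y-axis, and a pseudo 95% confidence funnel around the pooled effect.

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulated effect sizes and standard errors standing in for the 79 effect
# quantities; they are not the paper's data.
rng = np.random.default_rng(0)
se = rng.uniform(0.1, 0.5, 79)
es = 0.82 + rng.normal(0, se)                  # scatter around a pooled effect

mean_es = np.average(es, weights=1 / se ** 2)  # inverse-variance weighted mean

fig, ax = plt.subplots()
ax.scatter(es, se, alpha=0.6)
se_grid = np.linspace(se.min(), se.max(), 100)
ax.plot(mean_es - 1.96 * se_grid, se_grid, "k--")  # pseudo 95% funnel
ax.plot(mean_es + 1.96 * se_grid, se_grid, "k--")
ax.axvline(mean_es, color="k")
ax.invert_yaxis()                              # precise studies plotted at the top
ax.set_xlabel("Standardized mean difference")
ax.set_ylabel("Standard error")
ax.set_title("Funnel plot: symmetry suggests low publication bias")
plt.show()
```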

Heterogeneity test

To select the appropriate effect model for the meta-analysis, the results of a heterogeneity test on the effect sizes can be used. In a meta-analysis, it is common practice to gauge the degree of heterogeneity using the I² statistic; I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for a random-effects model, and otherwise a fixed-effect model should be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) yielded I² = 86%, indicating significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random-effects model.
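For reference, Cochran's Q and the I² statistic that drive this model choice can be computed from the effect sizes and their variances as in the minimal sketch below (hypothetical inputs, not the paper's data).

```python
import numpy as np

def heterogeneity(effect_sizes, variances):
    """Cochran's Q and I^2 for a set of effect sizes with known variances."""
    y = np.asarray(effect_sizes, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                           # inverse-variance weights
    fixed = np.sum(w * y) / np.sum(w)     # fixed-effect pooled estimate
    q = np.sum(w * (y - fixed) ** 2)      # Cochran's Q
    df = len(y) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical effect sizes and variances for three studies.
q, i2 = heterogeneity([0.4, 0.9, 1.3], [0.05, 0.04, 0.06])
print(f"Q = {q:.2f}, I^2 = {i2:.1f}% ->",
      "random-effects model" if i2 >= 50 else "fixed-effect model")
```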

The analysis of the overall effect size

This meta-analysis used a random-effects model to examine the 79 effect quantities from 36 studies in order to account for heterogeneity. In accordance with Cohen’s criterion (Cohen, 1992), the analysis results shown in the forest plot of the overall effect (see Fig. 3) make clear that the cumulative effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.

Figure 3. Forest plot showing the analysis of the overall effect size across 36 studies.
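The pooled estimate, z statistic, and confidence interval reported here come from an inverse-variance random-effects model; RevMan's random-effects analysis is based on the DerSimonian-Laird estimator of the between-study variance. A minimal sketch of that computation, again with hypothetical inputs rather than the paper's 79 effect quantities, follows.

```python
import numpy as np
from scipy import stats

def random_effects_pool(effect_sizes, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes."""
    y = np.asarray(effect_sizes, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)
    df = len(y) - 1
    # Between-study variance (tau^2), truncated at zero
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)             # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    z = pooled / se
    p = 2 * stats.norm.sf(abs(z))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, z, p, ci

# Hypothetical inputs for three studies.
print(random_effects_pool([0.4, 0.9, 1.3], [0.05, 0.04, 0.06]))
```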

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions of critical thinking, the improvement in students’ attitudinal tendency is much more pronounced, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners’ cognitive skills are more modest, sitting just above average (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

The whole forest plot’s 79 effect quantities underwent a two-tailed test, which revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors other than sampling error. Therefore, subgroup analysis was used to explore the moderating factors that might produce this considerable heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs, in order to further identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the moderating factors all have beneficial effects on critical thinking. The subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not differ significantly between groups (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), the evidence does not allow us to conclude that these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The specific results are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school had the largest effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, the non-significant intergroup difference means we cannot conclude that it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes were ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that the most effective approach to cultivating critical thinking through collaborative problem-solving is the mixed-course teaching type.

Various intervention durations significantly improved critical thinking, and there were significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes related to this variable showed a tendency to increase with longer intervention durations. The improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that the intervention duration and critical thinking’s impact are positively correlated, with a longer intervention duration having a greater effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The resource-supported learning scaffold (ES = 0.69, P < 0.01) acquired a medium-to-higher level of impact, the technique-supported learning scaffold (ES = 0.63, P < 0.01) also attained a medium-to-higher level of impact, and the teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high level of significant impact. These results show that the learning scaffold with teacher support has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size of 2–3 persons was the largest (ES = 0.99, P < 0.01), and when the group size was greater than 7 persons, the improvement in critical thinking was at the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows, the overall impact declines.

Various measuring tools influenced critical thinking positively, but without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). The self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, reaching a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, the non-significant difference means we cannot conclude that it is crucial in fostering the growth of critical thinking through collaborative problem-solving.

Different subject areas had varying degrees of positive impact on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached a significant level of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), having only a medium-low degree of effect compared with education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
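The intergroup χ² values reported for each moderator come from a test for subgroup differences. One standard construction, sketched below with hypothetical subgroup estimates rather than the paper's data, compares each subgroup's pooled effect against their weighted average and refers the resulting statistic to a χ² distribution with (number of subgroups − 1) degrees of freedom.

```python
import numpy as np
from scipy import stats

def subgroup_difference_test(subgroup_estimates, subgroup_ses):
    """Chi-square test for differences between subgroup pooled effects."""
    est = np.asarray(subgroup_estimates, dtype=float)
    se = np.asarray(subgroup_ses, dtype=float)
    w = 1.0 / se ** 2                       # weight each subgroup by precision
    overall = np.sum(w * est) / np.sum(w)   # weighted average across subgroups
    q_between = np.sum(w * (est - overall) ** 2)
    df = len(est) - 1
    p = stats.chi2.sf(q_between, df)
    return q_between, df, p

# Hypothetical pooled effects and standard errors for three teaching types.
print(subgroup_difference_test([1.34, 0.81, 0.27], [0.20, 0.15, 0.10]))
```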

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy has a considerable impact on cultivating learners’ critical thinking as a whole and has a favorable promotional effect on both of its dimensions. Several studies have reported that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students’ critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent data support for these views. Thus, the findings not only address the first research question regarding the overall effect of collaborative problem-solving on cultivating critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in cultivating critical thinking through collaborative problem-solving in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, whereas the corresponding improvements in cognitive skills are only marginally better. Some studies suggest that cognitive skill differs from attitudinal tendency in classroom instruction: the cultivation of the former as a key ability is a process of gradual accumulation, while the latter, as an attitude, is affected by the teaching context (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting, interesting, rewarding, and challenging because it takes the learners as the focus and examines poorly structured problems in real situations, and it can inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency influences cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that the two specific dimensions of critical thinking, as well as critical thinking as a whole, are affected by collaborative problem-solving, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students’ capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

In order to further explore the key factors that influence critical thinking, subgroup analysis was used to examine the possible moderating effects behind the considerable heterogeneity. The findings show that the moderating factors included in the 36 experimental designs, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area, can all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool are not significant, so we cannot conclude that these two factors are crucial in supporting the cultivation of critical thinking through collaborative problem-solving.

In terms of the learning stage, the various learning stages all influenced critical thinking positively but without significant intergroup differences, so the evidence does not establish which stage is crucial for fostering the growth of critical thinking.

Although higher education accounts for 70.89% of all the included empirical studies, high school may be the most appropriate learning stage for fostering students’ critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development, which needs to be studied further in follow-up research.

With regard to teaching type, mixed-course teaching may be the best teaching method for cultivating students’ critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which is not conducive to transfer; conversely, if students’ thinking is trained only within subject teaching, without systematic method training, it is difficult to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course, in parallel with other subject teaching, can achieve the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit critical thinking instruction (Bensley and Spero, 2014).

In terms of the intervention duration, the overall effect size shows an upward tendency with longer intervention times; thus, the intervention duration and the impact on critical thinking are positively correlated. Critical thinking, as a key competency for 21st-century students, is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period of time through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Therefore, future empirical studies ought to take this into account and adopt longer periods of critical thinking instruction.

With regard to group size, a group size of 2–3 persons has the highest effect size, and the comprehensive effect size generally decreases with increasing group size. This outcome is in line with some research findings; for example, a group composed of two to four members is considered most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis results also indicate that once the group size exceeds 7 people, smaller groups no longer produce better interaction and performance than larger groups. This may be because the learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which is helpful for cultivating critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with earlier findings: encouraging learners to collaborate, come up with solutions, and develop critical thinking skills by using learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while also enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds are designed to assist students in using learning approaches more successfully to adapt to the collaborative problem-solving process, with teacher-supported learning scaffolds having the greatest influence on critical thinking in this process because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as trustworthy and effective by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized measurement tools. “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking”, according to Simpson and Courtney (2002, p. 91). As a result, in order to gauge more fully and precisely how learners’ critical thinking has evolved, standardized measuring tools should be properly adapted to collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of science departments (e.g., mathematics, science, medical science) is larger than that of language arts and social sciences. Some recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can prove the rationality of their judgment according to accurate evidence and reasonable standards when they face challenges or poorly structured problems (Kyndt et al., 2013 ), which makes critical thinking crucial for developing scientific understanding and applying this understanding to practical problem solving for problems related to science, technology, and society (Yore et al., 2007 ).

Suggestions for critical thinking teaching

Other than those stated in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should put a special emphasis on the two core elements, which are collaboration and problem-solving, to design real problems based on collaborative situations. This meta-analysis provides evidence to support the view that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions on real problems during class instruction are key ways to teach critical thinking rather than simply reading speculative articles without practice (Mulnix, 2012 ). Furthermore, the improvement of students’ critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008 ). Consequently, it is essential for teachers to put a special emphasis on the two core elements, which are collaboration and problem-solving, and design real problems and encourage students to discuss, negotiate, and argue based on collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking, utilizing the approach of collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011 ; Leng and Lu, 2020 ), with the goal of cultivating learners’ critical thinking for flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners’ critical thinking. Therefore, teachers should design and implement mixed course teaching with real collaborative problem-solving situations in combination with the knowledge content of specific disciplines in conventional teaching, teach methods and strategies of critical thinking based on poorly structured problems to help students master critical thinking, and provide practical activities in which students can interact with each other to develop knowledge construction and critical thinking utilizing the approach of collaborative problem-solving.

Third, teachers, and particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote it. The teacher-supported learning scaffold had the greatest impact on learners’ critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be effectively taught when teachers recognize its significance for students’ growth and use the proper approaches while designing instructional activities (Forawi, 2016). Therefore, to enable teachers to create learning scaffolds that cultivate learners’ critical thinking through collaborative problem-solving, it is essential to concentrate on teacher-supported learning scaffolds and to enhance the instruction in teaching critical thinking offered to teachers, especially preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the pool of articles for review. Second, some data provided by the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and the differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, additional studies were published while this meta-analysis was being conducted, so the review is bounded by its search window. As the relevant research develops, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of the effectiveness of collaborative problem-solving in promoting students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem-solving is an effective teaching approach to foster learners’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students’ attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); its effect on students’ cognitive skills is smaller, reaching only an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by both the results and the discussion, all seven moderating factors identified across the 36 studies have varying degrees of beneficial effect on students’ critical thinking. The teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors that affect how it develops. Since the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tools (χ² = 0.08, P = 0.78 > 0.05) did not demonstrate any significant intergroup differences, we are unable to conclude that these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and Affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent


Supplementary information

Supplementary Tables


About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received: 07 August 2022

Accepted: 04 January 2023

Published: 11 January 2023

DOI: https://doi.org/10.1057/s41599-023-01508-1



Mission Possible: Measuring Critical Thinking and Problem Solving


You are a 4th grade student at Smith Elementary School. A local business wants to give your school money to help improve health for all of the students. The money will be used to pay for only one of these projects: An outdoor fitness course at the school or a fruit and salad bar for the lunchroom. Some students want a fitness course and some want a fruit and salad bar. … The school cannot have both. Your principal, Mr. Beach, wants you to help him make a choice.

Atkinson, D. (2017). Virginia rethinks high school in its profile of a graduate. State Education Standard , 17 (2), 28–33. Retrieved from www.nasbe.org/wp-content/uploads/Virginia-Rethinks-High-School-in-Its-Profile-of-a-Graduate_May-2017-Standard.pdf

Arter, J., & McTighe, J. (2001). Scoring rubrics in the classroom: Using performance criteria for assessing and improving student performance . Thousand Oaks, CA: Corwin.

Benson, J. (1981). A redefinition of content validity. Educational and Psychological Measurement , 41 (3), 793–802.

Council for Aid to Education. (2007). College and Work Readiness Assessment [Measurement instrument].

Guskey, T. R., & Jung, L. A. (2016). Grading: Why you should trust your judgment . Educational Leadership , 73 (7), 50–54.

Lane, S., & Iwatani, E. (2016). Design of performance assessments in education. In S. Lane, M.R. Raymond, & T.M. Haladyna (Eds.), Handbook of Test Development (2nd ed., pp. 274–293). New York: Routledge.

Picus, L. O., Adamson, F., Montague, W., & Owens, M. (2010). A new conceptual framework for analyzing the costs of performance assessment . Stanford, CA: Stanford University, Stanford Center for Opportunity Policy in Education. Retrieved from https://scale.stanford.edu/system/files/new-conceptual-framework-analyzing-costs-performance-assessment.pdf

Virginia Beach City Public Schools. (2008). Compass to 2015: A Strategic Plan for Student Success . Retrieved from www.vbschools.com/compass/2015

Virginia General Assembly. (2016). Code of Virginia § 22.1-253.13:4. Retrieved from http://law.lis.virginia.gov/vacode/22.1-253.13:4

Wagner, T. (2008). The global achievement gap: Why even our best schools don't teach the new survival skills our children need—and what we can do about it . New York: Basic Books.

Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: ASCD.

• 1 This student profile is the foundation of the Virginia Board of Education's redesign efforts to better prepare our students to participate in the global economy (Atkinson, 2017).


MSU Extension Child & Family Development

The importance of critical thinking for young children.

Kylie Rymanowicz, Michigan State University Extension - May 03, 2016


Critical thinking is an essential life skill. Learn why it is so important and how you can help children learn and practice these skills.

It is important to teach children critical thinking skills.

We use critical thinking skills every day. They help us to make good decisions, understand the consequences of our actions and solve problems. These incredibly important skills are used in everything from putting together puzzles to mapping out the best route to work. It’s the process of using focus and self-control to solve problems and set and follow through on goals. It utilizes other important life skills like making connections, perspective taking and communicating. Basically, critical thinking helps us make good, sound decisions.

Critical thinking

In her book, “Mind in the Making: The seven essential life skills every child needs,” author Ellen Galinsky explains the importance of teaching children critical thinking skills. A child’s natural curiosity helps lay the foundation for critical thinking. Critical thinking requires us to take in information, analyze it and make judgements about it, and that type of active engagement requires imagination and inquisitiveness. As children take in new information, they fill up a library of sorts within their brain. They have to think about how the new information fits in with what they already know, or whether it changes anything they already hold to be true.

Supporting the development of critical thinking

Michigan State University Extension has some tips on helping your child learn and practice critical thinking.

  • Encourage pursuits of curiosity. Embrace the dreaded “why” phase. Help them form and test theories, experiment and try to understand how the world works. Encourage children to explore, ask questions, test their theories, think critically about results and think about changes they could make or things they could do differently.
  • Learn from others. Help children think more deeply about things by instilling a love for learning and a desire to understand how things work. Seek out the answers to all of your children’s “why” questions using books, the internet, friends, family or other experts.
  • Help children evaluate information. We are often given lots of information at a time, and it is important we evaluate that information to determine if it is true, important and whether or not we should believe it. Help children learn these skills by teaching them to evaluate new information. Have them think about where or who the information is coming from, how it relates to what they already know and why it is or is not important.
  • Promote children’s interests. When children are deeply invested in a topic or pursuit, they are more engaged and willing to experiment. The process of expanding their knowledge brings about many opportunities for critical thinking, so encourage your child to invest in their interests. Whether it is learning about trucks and vehicles or a keen interest in insects, help your child follow their passion.
  • Teach problem-solving skills. When dealing with problems or conflicts, it is necessary to use critical thinking skills to understand the problem and come up with possible solutions, so teach them the steps of problem-solving and they will use critical thinking in the process of finding solutions to problems.

For more articles on child development, academic success, parenting and life skill development, please visit the MSU Extension website.

This article was published by Michigan State University Extension . For more information, visit https://extension.msu.edu . To have a digest of information delivered straight to your email inbox, visit https://extension.msu.edu/newsletters . To contact an expert in your area, visit https://extension.msu.edu/experts , or call 888-MSUE4MI (888-678-3464).


Supporting staff in implementing a new literacy program


In Designing a whole-school literacy program , Andrew Nicholls discussed the process he undertook when revamping his school's literacy program. Here, he discusses staff professional development to support its implementation and the program's impact on student learning outcomes.

After developing a new literacy program at a small rural school, I recognised that, in order for it to achieve success, professional development for staff was critical.

So, I designed PD sessions that aimed to engage staff by communicating:

  • The brief – what the Principal wanted and the school needed;
  • The approach – the data I used and the research I based my conclusions on;
  • The solution – what we were going to do to solve the problem.

The PD sessions

Building on Sir Ken Robinson's Changing Education Paradigms argument of ‘waking [students] up to what is inside themselves', and influenced by Wiggins (1998) and the work of his colleagues in the development of assessment standards, at the beginning of the school year I informed staff that the program would allow the school to generate data that was personal, informative, trackable, accessible and transferable.

I then provided teachers with an overview of the program and explained how each teacher would be allocated the responsibility of managing their own group's literacy data. Each teacher was then provided with a sample assessment and asked to complete it. Each assessment was marked by another staff member and then discussed, which prompted meaningful conversation about the complexity of the assessments.

To further support the implementation of the literacy program, I provided a comprehensive staff handbook at the end of the PD session. Each handbook contained:

  • The program's yearly calendar with detailed dates of each stage;
  • An abridged overview of each assessment program and how to conduct it;
  • Instructions on how to enter the data generated by each assessment; and,
  • Class allocations.

In Term 2, a second session was held. This session's content was based around feedback from the staff, who were asked to think, share and record their feedback before placing their thoughts on Plus, Minus, and Interesting (PMI) charts.

Overall, the staff response was overwhelmingly positive. They felt the continuum supported student learning and engagement, their teaching practice, and parents' understanding of their child's abilities. They also felt that the literacy program provided a cohesive structure within the school. Notably, some staff reported that their own literacy knowledge had begun to improve.

Impact on student learning outcomes

The immediate benefits of the program for students were apparent, particularly when teachers began to provide students with their individual results. For some, it was a relief to finally understand what might be holding back their ability to grasp knowledge from instruction or activities.

The long-term benefits of the program were evident by the impact it made to NAPLAN results. We had made a positive improvement across all assessments in reading, writing and language conventions (spelling, grammar and punctuation). Students in Year 9 showed improvement, and were either at or above the state's minimum reading standard.

Importantly, data from the Literacy Continuum showed that the majority of students were engaged with literacy, showing ownership over their learning and demonstrating a desire to improve.

The Literacy Continuum began the school's movement towards more frequent, targeted feedback to students. It led to the development of classroom performance feedback, and Student Progress Indicators (SPI). These tracked and fed information back to students about their ability to apply effort in class, submit coursework, complete assessments, and demonstrate preparedness.

As the year progressed, teachers became conscious of their ability to teach, and to some degree, relearn literacy knowledge and skills. At the core of this were the correct use of grammar and the orthography of spelling. As teachers continued to explicitly teach literacy skills, their ability and confidence to assess and improve student abilities in specialist areas of the curriculum also improved.

As literacy improved, so too did student behaviour. In the absence of the regular classroom teacher, disengaged and disruptive students would now sit and silently read. The library saw an increase in the amount of books being borrowed, and students began to use their personal devices to download books and novels.

Lastly, and perhaps most importantly, data became a colloquial term. The school community realised the impact of providing continual feedback to students and how diagnostic assessment could assist in pinpointing each student's Zone of Proximal Development. This not only provided teachers with the ability to provide differentiated and explicit instruction, but ensured they could measure their impact.

For example, the CARS (Comprehensive Assessment of Reading Strategies) program found that many of the school's Year 7 students had a major inability to infer meaning from text, which prompted teachers to consciously explain meaning when a text only implied it.

Ultimately, in my opinion, the key to this literacy program revolved around two factors: diagnostic assessment and continual feedback on progress, stagnation or regression. Without the ability to isolate each student's areas of need in terms of spelling orthography, grammar, and reading and comprehension, it would be impossible to target and improve their literacy skills.

It was the Literacy Continuum that supported this: a simple document that collated data and enabled students to engage, take ownership over their learning and celebrate their achievements with teachers, each other and their parents.

Wiggins, G. (1998). Educative Assessment. New York: Jossey-Bass Publishers.

Is there a literacy program at your school? If so, are all teachers part of the program, rather than just literacy teachers?

Do teachers use the data to understand each student's strengths and areas for improvement?



Critical thinking & problem-solving skills students need (Sponsored Post)

  • June 15, 2020


Do you have children of school age, and are you currently looking for a school to enrol them in? As parents, you need to consider a multitude of factors when searching for a school suited to your child’s educational needs. Selecting the appropriate school for your children is essential, as your chosen institution will nurture and guide your child through their developing years with the skills they need in adult life.

According to experts, some of the skills children should develop early in life are critical thinking and problem-solving skills, which help them form a structured base for the decisions they’ll make in their working and personal lives. Critical thinking underpins problem-solving, strategic planning, and understanding the effects of your actions. This article discusses the various critical thinking and problem-solving skills students need to develop and helps you find the right school to cultivate these skills in your children from an early age.

Table of Contents

  • What Are Critical Thinking and Problem-Solving Skills?
  • Critical Thinking Skills Students Should Develop
  • What Are the Barriers to Critical Thinking Development in Students?
  • Problem-Solving Skills That Are Essential to Students
  • Factors to Consider When Choosing the Right School for Your Kids

According to Kris Potrafka, founder and chief executive officer of Music First Hand, people who lack critical thinking skills have fewer promotion opportunities and are more susceptible to manipulation and fraud. It is for this very reason that employers highly value employees with problem-solving skills; these essential traits weigh heavily in the selection process when hiring candidates.

Critical thinking is the mental process of conceptualising, analysing, evaluating, and applying information to guide one's actions and beliefs. Information obtained from observation, reflection, experience, learning, communication, and reasoning becomes the cornerstone of the decisions we make.

Problem-solving, meanwhile, is the process of defining a problem, finding its cause, developing or finding a solution, and applying that solution. Excellent problem-solving skills are essential tools for career advancement.

Significant responsibility is placed on schools for developing characteristics in students which prepare them for their working roles, the decisions they make and how they interact with the community. But what exactly are the benefits of critical thinking? How does critical thinking serve as a safety net against poor decisions, and what are the essential elements which make up the critical thinking process?

Below, we summarise the essential elements that help individuals draw conclusions, make decisions and take decisive action at the right time:

  • Research – The ability to independently conduct research and verify issues or subjects and analyse arguments from different parties. A critical thinker sources information and determines its validity or factualness based on thorough investigation rather than what they are told to believe.
  • Identification – Determine the issue and formulate an understanding of the factors which may affect it. Having a clear picture of the problem allows the critical thinker to take the right approach in resolving it or make determinations on a course of action. Students should learn how to conduct a mental inventory of any new question, scenario, or situation.
  • Inference – To develop their critical thinking skills, students should learn how to infer or make an educated guess based on the collected information. In analysing the available data, students should take the initiative to collect other related information to make better conclusions about an issue, situation, or scenario.
  • Bias identification – The ability to recognise biases is essential in the critical thinking process. As critical thinkers, students should learn the best ways to assess information objectively. They should take into account the biases of the opposing arguments in their evaluation of the presented claims or information.
  • Curiosity – Students should be taught not to accept everything presented to them at face value. To develop their interest productively, they should learn how to ask open-ended questions about the things they observe around them.
  • Relevancy determination – In critical thinking, students should learn how to determine the relevant information they need to resolve an issue. To do this, they should evaluate their end goal and rank the collected data based on their relevance to the objective or problem.

In teaching critical thinking skills to students, teachers must determine the challenges and barriers that impede their progress. By identifying these barriers, teachers can develop strategies to overcome them. Here are some common educational roadblocks and how to avoid them:

  • Intolerance and Arrogance – Certain behavioural traits often prevent critical thinking, compelling some to react carelessly to specific situations and impairing their ability to solve problems. To eliminate intolerance and arrogance, teachers should encourage students to question their own way of thinking.
  • Personal Biases – Students of all ages should be mindful of biases which can block their ability to reason inquisitively and with an open mind. Educators are encouraged to motivate students to develop logical thinking through homework examples that help them question their methods and eliminate biases.
  • Schedule Issues – Time constraints and teacher workload often limit opportunities to develop critical thinking skills amongst students. Teachers are encouraged to prioritise lessons that develop thinking and to model thinking behaviours which enhance their students' critical thinking skills.
  • Drone Mentality – Young students often have a drone mentality, in which they take no interest in what is going on around them. Routine classroom activities can lead to this attitude and limit the development of critical thinking skills. To counter a drone mentality, teachers should focus on creative teaching strategies that spark and maintain student interest.
  • Groupthink – Groupthink is a barrier to the critical thinking process, especially when it is encouraged from a young age. To break this way of thinking, students should learn how to think independently and be encouraged to question popular beliefs, thoughts, and opinions.

Educators can help eliminate a groupthink perspective in their students by introducing teaching methods which encourage independent thinking. Students can learn how to develop individual thought and critical thinking through constructive arguments and debating activities.

  • Social Conditioning – Social conditioning develops through stereotyping and unwarranted assumptions. Students are vulnerable to social conditioning, as their critical thinking skills are not yet fully developed. Teaching them to think outside the box at an early age will help them avoid social conditioning. Educators should also teach their learners accuracy, fair-mindedness, and clarity in their thinking patterns.
  • Egocentrism – Egocentric thinking is more noticeable in young students as they are curious about themselves and where they fit in. In egocentric behaviour, individuals have a natural inclination to view everything in relation to themselves. This type of response prevents the development of different perspectives and sympathy for others.

To counter egocentric behaviour, teachers should encourage critical thinking activities in the classroom. They can assist students in improving their abstract thinking by highlighting the attitudes and opinions of others in examples of social conflict, and by modelling empathy and an understanding of students' views and their opinions of others.

In teaching students how to develop their problem-solving skills, teachers should use theories linked to the psychology of learning. The use of psychology may rouse students' curiosity and motivation in the learning process. The Australian Christian College discussed the importance of critical thinking in a recent article, which is a worthwhile read. Here are some of the problem-solving skills students should develop inside and outside the classroom:

  • Analysis – In problem-solving, the first step is to analyse the issue to formulate possible solutions or strategies to resolve it. Teachers should introduce lessons or teaching methods aimed at nurturing the analytical skills of their students. One example is cause-and-effect analysis.
  • Communication – Students should learn ways to communicate effectively to solve problems or issues successfully. Excellent communication is particularly essential if they are working with a team or an organisation. With proper communication, they can avoid confusion and misunderstandings. They will also learn how to determine the most appropriate communication channels when they need assistance.
  • Active listening – Listening skills are essential components of problem-solving. Through active listening, students can fully understand the problem or issue and respond accordingly. How well they grasp the problem will enable them to ask the right questions and allow them to have a clearer picture of the situation. Active listening will result in the development of better solutions.
  • Teamwork – In solving problems in an organisation or team, the camaraderie and rapport among team members are essential. Therefore, students should learn how to work independently and with their peers. Teachers should use team-building practices to allow their students to establish trust and better relationships among each other.
  • Research – Like in critical thinking, research skills are essential in problem-solving. Students should be able to analyse a problem’s cause and the factors involved to be able to solve it. In gathering facts and other data, they can conduct independent research, brainstorm with their team members, and consult with their teachers.
  • Decision making – As a problem solver, students should learn how to determine the most effective solutions to an issue. In the decision-making process, they should use the data obtained from their research and analysis. In a group setting, they should also learn how to reach a consensus during the decision-making process.
  • Creativity – Students should nurture their creativity in finding solutions to issues. Creative problem solving allows students to find fresh ideas and develop disruptive solutions to problems. Creativity leads to the development of new products, technologies, and processes.
  • Dependability – Creating solutions on time is essential in any organisation. Therefore, students should strive to become dependable by completing their tasks and homework on time. Students who demonstrate dependability are highly regarded by employers.

Academic achievements aside, one primary consideration for most parents is the development of their child's character, critical thinking and problem-solving skills. Several other factors will also weigh into your decision making; they are:

  • Academic programs – Choose a school with curricula and services based on a holistic approach to education. Its mission and objectives should include programs aimed at developing the integrity, compassion, resilience, and critical thinking skills of its students.
  • Educational cost – To make sure you can shoulder the expenses, enrol your kids in a school that fits your budget. Take into consideration any financial assistance the school may offer. You may also apply for scholarships for your children, if available.
  • School size – The number of students in a class may affect your children's learning. If you can afford it, you may enrol your children in a school with smaller classes to maximise their learning potential.
  • Location – Before looking for a school, determine whether you can afford to send your kids to faraway places to study. It is practical to enrol your kids in schools near you to save on transportation costs, gas, or boarding fees.
  • Reputation – A school with a reputation for quality education is preferable for your kids. Consider enrolling them in reputable schools to maximise their learning opportunities.
  • Extracurricular or special activities – Out-of-class activities are essential factors in developing the character and personality of your kids. Find schools with proactive extracurricular offerings.
  • Retention and graduation rates – Before enrolling your kids in a school, research its retention and graduation rates, as they are indicators of its quality. If possible, avoid schools with high transfer rates and low graduation rates, as they indicate poor quality.
  • School safety – The safety of your children should be one of the primary considerations in your choice of school. Research a school's crime statistics and the strategies it has in place to ensure student safety before enrolling your children.

Critical thinking and problem-solving are essential traits in the development of all children. As parents, actively encourage your children to engage in discussion about current world events, problems or issues they are passionate about, whether at home, at school or amongst friends. The skills they learn today will greatly aid them in making the right life choices, while increasing their perceived value within the community and to future employers.

Sponsored Post

Decoda Literacy Solutions

Working together for literacy


Literacy and Problem Solving

Problem-solving skills are closely tied to literacy. Literacy reaches beyond the process of reading to comprehension, critical thinking and decision making, which result in better problem-solving skills. These skills are highly valued not only in the workplace, but in many aspects of our lives. We use problem solving to make difficult situations better and find new ways of doing things.

“Reading is essentially a problem-solving task. Comprehending what is read, like problem solving, requires effort, planning, self-monitoring, strategy selection, and reflection.” – Q.E.D. Foundation

Models for Problem Solving

There are many problem-solving models and strategies to help teach critical thinking. They provide step-by-step processes we can use to address a challenge. One example is the ladder of inference. Developed by two Harvard professors in the 1970s, the ladder of inference explains the process behind our assumptions and, ultimately, how we make our decisions.

Understanding how the ladder of inference works allows us to back up and reinterpret the data we're receiving. By examining our own ladders, we can check our assumptions and help solve disagreements and other problems we encounter day to day.

Another problem-solving and decision-making model is the SODAS method. This stands for Situation, Options, Disadvantages, Advantages, and Select the best option. This method can be used with learners by presenting different real-life scenarios and by using role-playing. Students are presented with a problem situation, identify possible options to solve the problem and weigh the disadvantages and advantages of each option before coming to a solution. This method has been found to be particularly helpful in social situations.
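Purely as an illustration (this sketch is not part of the original post), the SODAS steps map naturally onto a small data structure and a selection rule: record the situation, list each option with its advantages and disadvantages, then select the option with the best balance. The scenario, options and scoring rule below are invented.

```python
# Hypothetical sketch of the SODAS method: Situation, Options,
# Disadvantages, Advantages, Select the best option.
situation = "A friend wants to copy my homework before class."

options = {
    "Let them copy": {
        "advantages": ["keeps the peace"],
        "disadvantages": ["both of us could be penalised", "my friend learns nothing"],
    },
    "Offer to explain the work instead": {
        "advantages": ["my friend learns", "no academic dishonesty"],
        "disadvantages": ["takes time before class"],
    },
    "Refuse and walk away": {
        "advantages": ["no risk of penalty"],
        "disadvantages": ["may strain the friendship", "my friend gets no help"],
    },
}

def select_best(options: dict) -> str:
    """Pick the option whose advantages most outweigh its disadvantages."""
    return max(
        options,
        key=lambda name: len(options[name]["advantages"]) - len(options[name]["disadvantages"]),
    )

print("Situation:", situation)
print("Selected option:", select_best(options))
```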

We know that literacy includes much more than reading and writing. Problem solving is a life skill that assists us at home, at work and in the community. It is an essential component of the modern definition of literacy. For more problem-solving strategies and methods, check out the resources below.

  • Integrating Digital Literacy and Problem Solving into Instruction
  • Introduction to Problem Solving Skills
  • The Ladder of Inference
  • Problem solved! A guide for employees and learners
  • Problem Solving with SODAS
  • Reading as Problem Solving/Impact of Higher Order Thinking
  • UP Skills for Work – Problem Solving


T/E design based learning: assessing student critical thinking and problem solving abilities

  • Published: 07 July 2020
  • Volume 32, pages 267–285 (2022)


  • Susheela Shanta (Governor’s STEM Academy, Roanoke, VA, USA), ORCID: orcid.org/0000-0002-2387-6318
  • John G. Wells (Virginia Tech, Blacksburg, VA, USA), ORCID: orcid.org/0000-0002-9730-577X


The research presented is an investigation into the critical thinking (CT) and problem solving (PS) abilities used by high school technology and engineering (T/E) students when attempting to achieve a viable solution to an authentic engineering design-no-make challenge presented outside the context of the classroom in which their STEM content was first learned. Five key abilities were identified and assessed as indicators of a student's ability to solve problems within the context of authentic engineering design. Findings from the data analyses indicate that T/E students who acquire STEM content through T/E design based learning demonstrate significantly better CT and PS abilities in designing an engineering solution than a hypothesized mean for students receiving their STEM content via traditional classroom instruction. Furthermore, student abilities associated with selecting and utilizing relevant science and math content and practices, and with communicating logical reasoning in their design solution, were found to be critical to successful problem solving.
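The abstract refers to comparing student performance against a hypothesized mean. The paper's actual analysis is not reproduced here; purely as an illustration of that kind of comparison, the sketch below runs a one-sample t-test on invented rubric scores using SciPy.

```python
# Illustrative only: compare a sample of rubric scores against a
# hypothesized population mean with a one-sample t-test (invented data).
from scipy import stats

rubric_scores = [14, 16, 13, 17, 15, 18, 16, 14, 17, 15]  # invented scores
hypothesized_mean = 12.0  # invented benchmark for traditionally taught students

t_statistic, p_value = stats.ttest_1samp(rubric_scores, popmean=hypothesized_mean)
print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")
# A small p-value would indicate the sample mean differs significantly
# from the hypothesized mean.
```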




Shanta, S., Wells, J.G. T/E design based learning: assessing student critical thinking and problem solving abilities. Int J Technol Des Educ 32, 267–285 (2022). https://doi.org/10.1007/s10798-020-09608-8


Keywords:

  • Design based
  • Authentic problems
  • Critical thinking
  • Problem solving
  • Student abilities
