
The Most Important Logical Thinking Skills (With Examples)


Logical thinking skills like critical thinking, research, and creative thinking are valuable assets in the workplace. Many employers seek out these skills because they want employees who weigh facts and data before deciding on an important course of action; decisions grounded that way help ensure the organization’s processes can continue to operate efficiently.

So, if you’re a job seeker or employee looking to explore and brush up on your logical thinking skills, you’re in luck. This article will cover examples of logical thinking skills in the workplace, as well as what you can do to showcase those skills on your resume and in interviews.

Key Takeaways:

Logical thinking is problem solving based on reasoning that follows a strictly structured progression of analysis.

Critical thinking, research, creativity, mathematics, reading, active listening, and organization are all important logical thinking skills in the workplace.

Logical thinking provides objectivity for decision making that multiple people can accept.

Deduction follows valid premises to reach a logical conclusion.

It can be very helpful to demonstrate logical thinking skills at a job interview.


  • What Is Logical Thinking?
  • 10 Examples of Logical Thinking Skills
  • Examples of Logical Thinking in the Workplace
  • What Is Deductive Reasoning?
  • Logical Thinking in a Job Interview
  • Logical Thinking Skills FAQ
  • Final Thoughts


Logical thinking is the ability to reason out an issue after observing and analyzing it from all angles. You can then form a conclusion that makes the most sense. It also includes the ability to take note of reactions and feedback to aid in the formation of the conclusion.

Logical thinking skills enable you to present your justification for the actions you take, the strategies you use, and the decisions you make. You can easily stand in front of your clients, peers, and supervisors and defend your product, service, and course of action if the necessity arises.

Logical thinking is an excellent way of solving complex problems. You can break the problem into smaller parts; solve them individually in a sequence, then present the complete solution. However, it is not infallible.

So, when a problem in the workplace feels overwhelming, you may want to think about it logically first.

Logical thinking skills are the set of abilities that enable you to reason through problems systematically. They enable you to provide well-reasoned answers to any issues that arise, and they empower you to make decisions that most people will consider rational.

Critical-thinking skills. If you are a critical thinker, then you can analyze and evaluate a problem before making judgments. You need to improve your critical thinking process to become a logical thinker.

Your critical thinking skills will improve your ability to solve problems. You will be the go-to employee concerning crises. People can rely on you to be reasonable whenever an issue arises instead of letting biases rule you.

Research skills. If you are a good researcher, then you can search and locate data that can be useful when presenting information on your preferred subject.

The more relevant information you have about a particular subject, the more accurate your conclusions are likely to be. The sources you use must be reputable and relevant.

For this reason, your ability to ferret out information will affect how well you can reason logically.

Creative thinking skills. If you are a creative thinker, then you can find innovative solutions to problems.

You are the kind of person that can think outside the box when brainstorming ideas and potential solutions. Your thinking is not rigid. Instead, you tend to look at issues in ways other people have not thought of before.

While logical thinking is based on data and facts, that doesn’t mean it is rigid. You can creatively find ways of sourcing that data or experimenting so that you can form logical conclusions. Your strategic thinking skills will also help you analyze reactions and collect feedback.

Mathematical skills. If you are skilled in mathematics, then you can work well with numbers and represent mathematical ideas using visual symbols. Your brain must be able to compute information.

Business is a numbers game. That means you must have some knowledge of mathematics. You must be able to perform basic mathematical tasks involving addition, subtraction, division, multiplication, and so on.

So, to become a logical thinker, you must be comfortable working with numbers. You will encounter them in many business-related complex problems. And your ability to understand them will determine whether you can reach an accurate logical conclusion that helps your organization.

Reading skills. If you are a good reader, then you can make sense of the letters and symbols that you see. Your ability to read will determine your competency concerning your logical thinking and reasoning skills.

And that skill set will come in handy when you are presented with different sets of work-related statements from which you are meant to draw conclusions. Such statements may be part of your company policy, a technical manual, etc.

Active listening skills. Active listening is an important communication skill to have. If you are an active listener, then you can hear, understand what is being said, remember it, and respond to it if necessary.

Not all instructions are written. You may need to listen to someone to get the information you need to solve problems before you write it down. In that case, your active listening skills will determine how well you can remember the information so that you can use it to reason things out logically.

Information ordering skills. If you have information ordering skills, then you can arrange things based on a specified order following the set rules or conditions. These things may include mathematical operations, words, pictures, etc.

Different organizations have different business processes. The workflow in one organization will not be the same as that of another, even if both belong to the same industry.

Your ability to order information will depend on an organization’s culture. And it will have a major impact on how you can think and reason concerning solutions to your company problems.

If you follow the wrong order, then no matter how good your problem-solving techniques are, your conclusions may be wrong for your organization.

Persuasion skills. Logical thinking can be useful when persuading others, especially in the workplace.

For example, let’s say one of your co-workers wants to take a project in an impulsive direction, which would increase the budget. However, after doing your research, you realize a budget increase would be impossible.

You can then use your logical thinking skills to explain the situation to your co-worker, including detailed facts and numbers, which will help dissuade them from making an uninformed decision.

Decision making skills. Decision making skills go hand in hand with logical thinking, as being able to think logically about solutions and research topics makes it far easier to make informed decisions.

After all, no one likes making a decision that feels like a shot in the dark. Knowing crucial information about the options available to you, and thinking about them logically, can improve your confidence in your decision making.

Confidence skills. Confidence that stems from an emotional or irrational place will always be fragile, but when logical thinking gives you more knowledge to draw on, your confidence rests on firmer ground.

For instance, if an employee asks you to answer an important question, you will have a lot more confidence in your answer if you can think it through logically, as opposed to answering with an air of uncertainty.

To improve your logic skills, it would be wise to practice solving problems based on facts and data. Below are examples of logical thinking in the workplace that will help you understand this kind of reasoning so that you can improve your own thinking:

The human resource department in your organization has determined that leadership skills are important for anyone looking to go into a senior management position. So, it decides that it needs proof of leadership before hiring anyone internally. To find the right person for the senior management position, every candidate must undertake a project that involves a team of five. Whoever leads the winning team will get the senior managerial position.

This example shows a logical conclusion that is reached by your organization’s human resource department. In this case, your HR department has utilized logical thinking to determine the best internal candidate for the senior manager position.

It could be summarized as follows:

Statement 1: People with excellent leadership skills who produce winning teams make great senior managers.
Statement 2: Candidate A is an excellent leader who has produced a winning team.
Conclusion: Candidate A will make an excellent senior manager.
A marketing company researches working women on behalf of one of its clients, a robotics company. It finds that these women feel overwhelmed by responsibilities at home and in the workplace. As a result, they do not have enough time to clean, take care of their children, and stay productive in the workplace. The robotics company uses this research to create a robot cleaner that can be operated remotely. It then advertises this cleaner specifically to working women with the tag line, “Working women can do it all with a little bit of help.” As a result of this marketing campaign, its revenues double within a year.

This example shows a logical conclusion reached by a robotics company after receiving the results of marketing research on working women. In this case, logical thinking has enabled the company to come up with a new marketing strategy for their cleaning product.

Statement 1: Working women struggle to keep their homes clean.
Statement 2: Robot cleaners can take over cleaning duties for women who struggle to keep their homes clean.
Conclusion: Robot cleaners can help working women keep their homes clean.
CalcX Inc. has created a customer survey concerning its new finance software. The goal of the survey is to determine what customers like best about the software. After reading through over 100 customer reviews and ratings, it emerges that 60% of customers love the new user interface because it’s easy to navigate. CalcX Inc. then decides to improve its marketing strategy. It trains every salesperson to talk about the easy navigation feature and how superior it is to the competition. So, every time a client objects to the price, the sales rep can admit that it is expensive, but point out that the excellent user interface makes up for the price. At the end of the year, it emerges that this strategy has improved sales revenues by 10%.

The above example shows how logical thinking has helped CalcX Inc. sell more software and improve its bottom line.

Statement 1: If the majority of customers like a particular software feature, then sales reps should use it to overcome objections and increase revenues.
Statement 2: 60% of the surveyed customers like the user interface of the new software and think it makes navigation easier.
Conclusion: The sales reps should market the new software’s user interface, and the fact that it is easy to navigate, to improve the company’s bottom line.
A political candidate hires a focus group to discuss hot-button issues they feel strongly about. It emerges that the group is torn on sexual reproductive health issues, but most support the issue of internal security. However, nearly everyone opposes the lower wages being paid during the current economic crisis. Based on the results of this research, the candidate decides to focus on improving the economy and security mechanisms in the country. He also decides to let go of the sexual reproductive health issue because it would potentially cost him some support.

In this case, the political candidate has made logical conclusions on what topics he should use to campaign for his seat with minimal controversies so that he doesn’t lose many votes.

This situation could be summarized as follows:

Statement 1: Most people find sexual reproductive health issues controversial and cannot agree.
Statement 2: Most people feel that the internal security of the country is in jeopardy and something should be done about it.
Statement 3: Most people want higher wages and an improved economy.
Statement 4: Political candidates who want to win must avoid controversy and speak up on things that matter to people.
Conclusion: To win, political candidates must focus on higher wages, an improved economy, and the internal security of the country while avoiding sexual reproductive health matters.

Deductive reasoning is an aspect of logical reasoning. It is a top-down reasoning approach that enables you to form a specific logical conclusion based on generalities. Therefore, you can use one or more statements, usually referred to as premises, to conclude something.

For example:

Statement 1: All mothers are women.
Statement 2: Daisy is a mother.
Conclusion: Daisy is a woman.

Based on the above example, all mothers are classified as women, and since Daisy is a mother, it is logical to deduce that she is a woman too.

It’s worth noting, though, that deductive reasoning does not always produce a conclusion that is accurate in reality.

Statement 1: All caregivers in this room are nurses.
Statement 2: This dog, Tom, is a caregiver.
Conclusion: This dog, Tom, is a nurse.

From the above example, we have deduced that Tom the dog is a nurse simply because the first statement said that all caregivers are nurses. Yet in reality, we know that dogs cannot be nurses; they do not have the capacity to engage in the profession.

For this reason, you must bear in mind that an argument can be valid, in that its conclusion follows from its premises, and yet unsound if one of those premises is false.
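The mechanics of deduction described above can be sketched in code. The short Python example below is purely illustrative (the function name, data representation, and premise encoding are assumptions, not anything from the article): it follows “All X are Y” premises to whatever conclusions they license. Notably, it will happily derive “Tom is a nurse” from the false caregiver premise, because it checks only the form of the argument, not the truth of its premises.

```python
# Illustrative sketch only -- names and representation are assumptions.
# Each "All X are Y" premise is stored as a dict entry mapping the
# category X to the category Y it is contained in.

def deduce(known_categories, subset_premises):
    """Follow "All X are Y" premises to every category they license."""
    known = set(known_categories)
    changed = True
    while changed:
        changed = False
        for sub, sup in subset_premises.items():
            if sub in known and sup not in known:
                known.add(sup)  # the premise "All sub are sup" licenses this
                changed = True
    return known

# Valid and sound: all mothers are women; Daisy is a mother.
print(sorted(deduce({"mother"}, {"mother": "woman"})))  # ['mother', 'woman']

# Valid but unsound: the false premise "all caregivers are nurses"
# still mechanically yields the conclusion that Tom the dog is a nurse.
print(sorted(deduce({"caregiver"}, {"caregiver": "nurse"})))
```

The deduction is equally valid in both runs; only the truth of the premises differs, which is exactly the point of the dog example above.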

Since logical thinking is so important in the workplace, most interviewers will want to see you demonstrate this skill at the job interview. Keep your logical thinking skills in mind when you talk about yourself in the interview.

There are many ways in which an interviewer may ask you to demonstrate your logical thinking skills. For example:

You may have to solve an example problem. If the interviewer provides you a problem similar to one you might find at your job, make sure to critically analyze the problem to deduce a solution.

You may be asked about a previous problem or conflict you had to solve. This classic question provides you the opportunity to show your skills in action, so make sure to highlight the objectivity and logic of your problem solving.

Show your logic when talking about yourself. When given the opportunity to talk about yourself, highlight how logic comes into play in your decision making. This could be in how you picked the job position, why you choose your career or education, or what it is about yourself that makes you a great candidate.

Why is it important to think logically?

It’s important to think logically because it allows you to analyze a situation and come up with a sound solution. It lets you reason through important decisions and solve problems with a better understanding of what needs to be done. This is necessary for developing a strong career.

Why is logic important?

Logic is important because it helps develop critical thinking skills. Critical thinking skills are important because they help you analyze and evaluate a problem before you make a decision. It also helps you improve your problem-solving skills to allow you to make better decisions.

How do you improve your logical thinking skills?

When improving your logical thinking skills, make sure you spend time on a creative hobby and practice asking questions. Creative hobbies can help reduce stress levels, and lower stress makes it easier to focus on tasks and think logically. Creative hobbies include things like drawing, painting, and writing.

Another way to improve your logical thinking is to start asking questions about things. Asking questions allows you to discover new things and learn about topics you may not have thought about before.

What are logical thinking skills you need to succeed at work?

There are many logical thinking skills you need to succeed in the workplace. Our top picks include:

Observation

Active Listening

Problem-solving

Logical thinking skills are valuable skills to have. You need to develop them so that you can become an asset to any organization that hires you. Be sure to include them in your resume and cover letter.

And if you make it to the interview, also ensure that you highlight these skills. You can do all this by highlighting the career accomplishments that required you to use logical thinking in the workplace.


Roger Raber has been a content writer at Zippia for over a year and has authored several hundred articles. Having retired after 28 years of teaching writing and research at both the high school and college levels, Roger enjoys providing career details that help inform people who are curious about a new job or career. Roger holds a BA in English from Cleveland State University and an MA from Marygrove College.


College of DuPage Digital Press


Module 7: Thinking, Reasoning, and Problem-Solving

This module is about how a solid working knowledge of psychological principles can help you to think more effectively, so you can succeed in school and life. You might be inclined to believe that—because you have been thinking for as long as you can remember, because you are able to figure out the solution to many problems, because you feel capable of using logic to argue a point, because you can evaluate whether the things you read and hear make sense—you do not need any special training in thinking. But this, of course, is one of the key barriers to helping people think better. If you do not believe that there is anything wrong, why try to fix it?

The human brain is indeed a remarkable thinking machine, capable of amazing, complex, creative, logical thoughts. Why, then, are we telling you that you need to learn how to think? Mainly because one major lesson from cognitive psychology is that these capabilities of the human brain are relatively infrequently realized. Many psychologists believe that people are essentially “cognitive misers.” It is not that we are lazy, but that we have a tendency to expend the least amount of mental effort necessary. Although you may not realize it, it actually takes a great deal of energy to think. Careful, deliberative reasoning and critical thinking are very difficult. Because we seem to be successful without going to the trouble of using these skills well, it feels unnecessary to develop them. As you shall see, however, there are many pitfalls in the cognitive processes described in this module. When people do not devote extra effort to learning and improving reasoning, problem solving, and critical thinking skills, they make many errors.

As is true for memory, if you develop the cognitive skills presented in this module, you will be more successful in school. It is important that you realize, however, that these skills will help you far beyond school, even more so than a good memory will. Although it is somewhat useful to have a good memory, ten years from now no potential employer will care how many questions you got right on multiple choice exams during college. All of them will, however, recognize whether you are a logical, analytical, critical thinker. With these thinking skills, you will be an effective, persuasive communicator and an excellent problem solver.

The module begins by describing different kinds of thought and knowledge, especially conceptual knowledge and critical thinking. An understanding of these differences will be valuable as you progress through school and encounter different assignments that require you to tap into different kinds of knowledge. The second section covers deductive and inductive reasoning, which are processes we use to construct and evaluate strong arguments. They are essential skills to have whenever you are trying to persuade someone (including yourself) of some point, or to respond to someone’s efforts to persuade you. The module ends with a section about problem solving. A solid understanding of the key processes involved in problem solving will help you to handle many daily challenges.

7.1. Different kinds of thought

7.2. Reasoning and Judgment

7.3. Problem Solving

READING WITH PURPOSE

Remember and understand.

By reading and studying Module 7, you should be able to remember and describe:

  • Concepts and inferences (7.1)
  • Procedural knowledge (7.1)
  • Metacognition (7.1)
  • Characteristics of critical thinking:  skepticism; identify biases, distortions, omissions, and assumptions; reasoning and problem solving skills  (7.1)
  • Reasoning:  deductive reasoning, deductively valid argument, inductive reasoning, inductively strong argument, availability heuristic, representativeness heuristic  (7.2)
  • Fixation:  functional fixedness, mental set  (7.3)
  • Algorithms, heuristics, and the role of confirmation bias (7.3)
  • Effective problem solving sequence (7.3)

By reading and thinking about how the concepts in Module 7 apply to real life, you should be able to:

  • Identify which type of knowledge a piece of information is (7.1)
  • Recognize examples of deductive and inductive reasoning (7.2)
  • Recognize judgments that have probably been influenced by the availability heuristic (7.2)
  • Recognize examples of problem solving heuristics and algorithms (7.3)

Analyze, Evaluate, and Create

By reading and thinking about Module 7, participating in classroom activities, and completing out-of-class assignments, you should be able to:

  • Use the principles of critical thinking to evaluate information (7.1)
  • Explain whether examples of reasoning arguments are deductively valid or inductively strong (7.2)
  • Outline how you could try to solve a problem from your life using the effective problem solving sequence (7.3)

7.1. Different kinds of thought and knowledge

  • Take a few minutes to write down everything that you know about dogs.
  • Do you believe that:
      • Psychic ability exists?
      • Hypnosis is an altered state of consciousness?
      • Magnet therapy is effective for relieving pain?
      • Aerobic exercise is an effective treatment for depression?
      • UFOs from outer space have visited earth?

On what do you base your belief or disbelief for the questions above?

Of course, we all know what is meant by the words think and knowledge. You probably also realize that they are not unitary concepts; there are different kinds of thought and knowledge. In this section, let us look at some of these differences. If you are familiar with these different kinds of thought and pay attention to them in your classes, it will help you to focus on the right goals, learn more effectively, and succeed in school. Different assignments and requirements in school call on you to use different kinds of knowledge or thought, so it will be very helpful for you to learn to recognize them (Anderson et al., 2001).

Factual and conceptual knowledge

Module 5 introduced the idea of declarative memory, which is composed of facts and episodes. If you have ever played a trivia game or watched Jeopardy on TV, you realize that the human brain is able to hold an extraordinary number of facts. Likewise, you realize that each of us has an enormous store of episodes, essentially facts about events that happened in our own lives. It may be difficult to keep that in mind when we are struggling to retrieve one of those facts while taking an exam, however. Part of the problem is that, in contradiction to the advice from Module 5, many students continue to try to memorize course material as a series of unrelated facts (picture a history student simply trying to memorize history as a set of unrelated dates without any coherent story tying them together). Facts in the real world are not random and unorganized, however. It is the way that they are organized that constitutes a second key kind of knowledge, conceptual.

Concepts are nothing more than our mental representations of categories of things in the world. For example, think about dogs. When you do this, you might remember specific facts about dogs, such as they have fur and they bark. You may also recall dogs that you have encountered and picture them in your mind. All of this information (and more) makes up your concept of dog. You can have concepts of simple categories (e.g., triangle), complex categories (e.g., small dogs that sleep all day, eat out of the garbage, and bark at leaves), kinds of people (e.g., psychology professors), events (e.g., birthday parties), and abstract ideas (e.g., justice). Gregory Murphy (2002) refers to concepts as the “glue that holds our mental life together” (p. 1). Very simply, summarizing the world by using concepts is one of the most important cognitive tasks that we do. Our conceptual knowledge  is  our knowledge about the world. Individual concepts are related to each other to form a rich interconnected network of knowledge. For example, think about how the following concepts might be related to each other: dog, pet, play, Frisbee, chew toy, shoe. Or, of more obvious use to you now, how these concepts are related: working memory, long-term memory, declarative memory, procedural memory, and rehearsal? Because our minds have a natural tendency to organize information conceptually, when students try to remember course material as isolated facts, they are working against their strengths.

One last important point about concepts is that they allow you to instantly know a great deal of information about something. For example, if someone hands you a small red object and says, “here is an apple,” they do not have to tell you, “it is something you can eat.” You already know that you can eat it because it is true by virtue of the fact that the object is an apple; this is called drawing an inference, assuming that something is true on the basis of your previous knowledge (for example, of category membership or of how the world works) or logical reasoning.

Procedural knowledge

Physical skills, such as tying your shoes, doing a cartwheel, and driving a car (or doing all three at the same time, but don’t try this at home) are certainly a kind of knowledge. They are procedural knowledge, the same idea as procedural memory that you saw in Module 5. Mental skills, such as reading, debating, and planning a psychology experiment, are procedural knowledge, as well. In short, procedural knowledge is the knowledge how to do something (Cohen & Eichenbaum, 1993).

Metacognitive knowledge

Floyd used to think that he had a great memory. Now, he has a better memory. Why? Because he finally realized that his memory was not as great as he once thought it was. Because Floyd eventually learned that he often forgets where he put things, he finally developed the habit of putting things in the same place. (Unfortunately, he did not learn this lesson before losing at least 5 watches and a wedding ring.) Because he finally realized that he often forgets to do things, he finally started using the To Do list app on his phone. And so on. Floyd’s insights about the real limitations of his memory have allowed him to remember things that he used to forget.

All of us have knowledge about the way our own minds work. You may know that you have a good memory for people’s names and a poor memory for math formulas. Someone else might realize that they have difficulty remembering to do things, like stopping at the store on the way home. Others still know that they tend to overlook details. This knowledge about our own thinking is actually quite important; it is called metacognitive knowledge, or metacognition. Like other kinds of thinking skills, it is subject to error. For example, in unpublished research, one of the authors surveyed about 120 General Psychology students on the first day of the term. Among other questions, the students were asked to predict their grade in the class and report their current Grade Point Average. Two-thirds of the students predicted that their grade in the course would be higher than their GPA. (The reality is that at our college, students tend to earn lower grades in psychology than their overall GPA.) Another example: Students routinely report that they thought they had done well on an exam, only to discover, to their dismay, that they were wrong (more on that important problem in a moment). Both errors reveal a breakdown in metacognition.

The Dunning-Kruger Effect

In general, most college students probably do not study enough. For example, using data from the National Survey of Student Engagement, Fosnacht, McCormack, and Lerma (2018) reported that first-year students at 4-year colleges in the U.S. averaged less than 14 hours per week preparing for classes. The typical suggestion is that you should spend two hours outside of class for every hour in class, or 24 – 30 hours per week for a full-time student. Clearly, students in general are nowhere near that recommended mark. Many observers, including some faculty, believe that this shortfall is a result of students being too busy or lazy. Now, it may be true that many students are too busy, with work and family obligations, for example. Others are not particularly motivated in school, and therefore might correctly be labeled lazy. A third possible explanation, however, is that some students might not think they need to spend this much time. And this is a matter of metacognition. Consider the scenario that we mentioned above, students thinking they had done well on an exam only to discover that they did not. Justin Kruger and David Dunning examined scenarios very much like this in 1999. Kruger and Dunning gave research participants tests measuring humor, logic, and grammar. Then, they asked the participants to assess their own abilities and test performance in these areas. They found that participants in general tended to overestimate their abilities, already a problem with metacognition. Importantly, the participants who scored the lowest overestimated their abilities the most. Specifically, participants who scored in the bottom quarter (averaging in the 12th percentile) thought they had scored in the 62nd percentile. This has become known as the Dunning-Kruger effect. Many individual faculty members have replicated these results with their own students on their course exams, including the authors of this book. Think about it.
Before seeing their score, some students who have just taken an exam and performed poorly believe that they did well. It seems very likely that these are the very same students who stopped studying the night before because they thought they were “done.” Quite simply, it is not just that they did not know the material. They did not know that they did not know the material. That is poor metacognition.

In order to develop good metacognitive skills, you should continually monitor your thinking and seek frequent feedback on the accuracy of your thinking (Medina, Castleberry, & Persky, 2017). For example, in your classes, get in the habit of predicting your exam grades. As soon as possible after taking an exam, try to find out which questions you missed and try to figure out why. If you do this soon enough, you may be able to recall the way it felt when you originally answered the question. Did you feel confident that you had answered the question correctly? Then you have just discovered an opportunity to improve your metacognition. Be on the lookout for that feeling and respond with caution.

concept :  a mental representation of a category of things in the world

Dunning-Kruger effect : individuals who are less competent tend to overestimate their abilities more than individuals who are more competent do

inference : an assumption about the truth of something that is not stated. Inferences come from our prior knowledge and experience, and from logical reasoning

metacognition :  knowledge about one’s own cognitive processes; thinking about your thinking

Critical thinking

One particular kind of knowledge or thinking skill that is related to metacognition is  critical thinking (Chew, 2020). You may have noticed that critical thinking is an objective in many college courses, and thus it could be a legitimate topic to cover in nearly any college course. It is particularly appropriate in psychology, however. As the science of (behavior and) mental processes, psychology is obviously well suited to be the discipline through which you should be introduced to this important way of thinking.

More importantly, there is a particular need to use critical thinking in psychology. We are all, in a way, experts in human behavior and mental processes, having engaged in them literally since birth. Thus, perhaps more than in any other class, students typically approach psychology with very clear ideas and opinions about its subject matter. That is, students already “know” a lot about psychology. The problem is, “it ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so” (Ward, quoted in Gilovich 1991). Indeed, many of students’ preconceptions about psychology are just plain wrong. Randolph Smith (2002) wrote a book about critical thinking in psychology called  Challenging Your Preconceptions,  highlighting this fact. On the other hand, many of students’ preconceptions about psychology are just plain right! But wait, how do you know which of your preconceptions are right and which are wrong? And when you come across a research finding or theory in this class that contradicts your preconceptions, what will you do? Will you stick to your original idea, discounting the information from the class? Will you immediately change your mind? Critical thinking can help us sort through this confusing mess.

But what is critical thinking? The goal of critical thinking is simple to state (but extraordinarily difficult to achieve): it is to be right, to draw the correct conclusions, to believe in things that are true and to disbelieve things that are false. We will provide two definitions of critical thinking (or, if you like, one large definition with two distinct parts). First, a more conceptual one: Critical thinking is thinking like a scientist in your everyday life (Schmaltz, Jansen, & Wenckowski, 2017).  Our second definition is more operational; it is simply a list of skills that are essential to be a critical thinker. Critical thinking entails solid reasoning and problem solving skills; skepticism; and an ability to identify biases, distortions, omissions, and assumptions. Excellent deductive and inductive reasoning and problem solving skills contribute to critical thinking. So, you can consider the subject matter of sections 7.2 and 7.3 to be part of critical thinking. Because we will be devoting considerable time to these concepts in the rest of the module, let us begin with a discussion about the other aspects of critical thinking.

Let’s address that first part of the definition. Scientists form hypotheses, or predictions about some possible future observations. Then, they collect data, or information (think of this as making those future observations). They do their best to make unbiased observations using reliable techniques that have been verified by others. Then, and only then, they draw a conclusion about what those observations mean. Oh, and do not forget the most important part. “Conclusion” is probably not the most appropriate word because this conclusion is only tentative. A scientist is always prepared for the possibility that someone else will come along and produce new observations that require a new conclusion to be drawn. Wow! If you like to be right, you could do a lot worse than using a process like this.

A Critical Thinker’s Toolkit 

Now for the second part of the definition. Good critical thinkers (and scientists) rely on a variety of tools to evaluate information. Perhaps the most recognizable tool for critical thinking is  skepticism (and this term provides the clearest link to the thinking like a scientist definition, as you are about to see). Some people intend it as an insult when they call someone a skeptic. But if someone calls you a skeptic, if they are using the term correctly, you should consider it a great compliment. Simply put, skepticism is a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided. People from Missouri should recognize this principle, as Missouri is known as the Show-Me State. As a skeptic, you are not inclined to believe something just because someone said so, because someone else believes it, or because it sounds reasonable. You must be persuaded by high quality evidence.

Of course, if that evidence is produced, you have a responsibility as a skeptic to change your belief. Failure to change a belief in the face of good evidence is not skepticism; skepticism has open mindedness at its core. M. Neil Browne and Stuart Keeley (2018) use the term weak sense critical thinking to describe critical thinking behaviors that are used only to strengthen a prior belief. Strong sense critical thinking, on the other hand, has as its goal reaching the best conclusion. Sometimes that means strengthening your prior belief, but sometimes it means changing your belief to accommodate the better evidence.

Many times, a failure to think critically or weak sense critical thinking is related to a  bias , an inclination, tendency, leaning, or prejudice. Everybody has biases, but many people are unaware of them. Awareness of your own biases gives you the opportunity to control or counteract them. Unfortunately, however, many people are happy to let their biases creep into their attempts to persuade others; indeed, it is a key part of their persuasive strategy. To see how these biases influence messages, just look at the different descriptions and explanations of the same events given by people of different ages or income brackets, or conservative versus liberal commentators, or by commentators from different parts of the world. Of course, to be successful, these people who are consciously using their biases must disguise them. Even undisguised biases can be difficult to identify, so disguised ones can be nearly impossible.

Here are some common sources of biases:

  • Personal values and beliefs.  Some people believe that human beings are basically driven to seek power and that they are typically in competition with one another over scarce resources. These beliefs are similar to the world-view that political scientists call “realism.” Other people believe that human beings prefer to cooperate and that, given the chance, they will do so. These beliefs are similar to the world-view known as “idealism.” For many people, these deeply held beliefs can influence, or bias, their interpretations of such wide ranging situations as the behavior of nations and their leaders or the behavior of the driver in the car ahead of you. For example, if your worldview is that people are typically in competition and someone cuts you off on the highway, you may assume that the driver did it purposely to get ahead of you. Other types of beliefs about the way the world is or the way the world should be, for example, political beliefs, can similarly become a significant source of bias.
  • Racism, sexism, ageism and other forms of prejudice and bigotry.  These are, sadly, a common source of bias in many people. They are essentially a special kind of “belief about the way the world is.” These beliefs—for example, that women do not make effective leaders—lead people to ignore contradictory evidence (examples of effective women leaders, or research that disputes the belief) and to interpret ambiguous evidence in a way consistent with the belief.
  • Self-interest.  When particular people benefit from things turning out a certain way, they can sometimes be very susceptible to letting that interest bias them. For example, a company that will earn a profit if they sell their product may have a bias in the way that they give information about their product. A union that will benefit if its members get a generous contract might have a bias in the way it presents information about salaries at competing organizations. (Note that our inclusion of examples describing both companies and unions is an explicit attempt to control for our own personal biases). Home buyers are often dismayed to discover that they purchased their dream house from someone whose self-interest led them to lie about flooding problems in the basement or back yard. This principle, the biasing power of self-interest, is likely what led to the famous phrase  Caveat Emptor  (let the buyer beware) .  

Knowing that these types of biases exist will help you evaluate evidence more critically. Do not forget, though, that people are not always keen to let you discover the sources of biases in their arguments. For example, companies or political organizations can sometimes disguise their support of a research study by contracting with a university professor, who comes complete with a seemingly unbiased institutional affiliation, to conduct the study.

People’s biases, conscious or unconscious, can lead them to make omissions, distortions, and assumptions that undermine our ability to correctly evaluate evidence. It is essential that you look for these elements. Always ask, what is missing, what is not as it appears, and what is being assumed here? For example, consider this (fictional) chart from an ad reporting customer satisfaction at 4 local health clubs.

[Figure: bar chart of customer satisfaction at four health clubs, with no scale shown]

Clearly, from the results of the chart, one would be tempted to give Club C a try, as customer satisfaction is much higher than for the other 3 clubs.

There are so many distortions and omissions in this chart, however, that it is actually quite meaningless. First, how was satisfaction measured? Do the bars represent responses to a survey? If so, how were the questions asked? Most importantly, where is the missing scale for the chart? Although the differences look quite large, are they really?

Well, here is the same chart, with a different scale, this time labeled:

[Figure: the same bar chart, redrawn with a labeled scale]

Club C is not so impressive any more, is it? In fact, all of the health clubs have customer satisfaction ratings (whatever that means) between 85% and 88%. In the first chart, the entire scale of the graph included only the percentages between 83 and 89. This “judicious” choice of scale—some would call it a distortion—and omission of that scale from the chart make the tiny differences among the clubs seem important, however.
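The distortion can be quantified without drawing anything. Here is a minimal sketch; the satisfaction figures are the fictional ones from the charts above, and the axis limits (83–89 versus 0–100) are our assumptions about how such a chart might be drawn:

```python
# How much of the visible axis each bar fills, under two choices of scale.
# Satisfaction figures are the fictional ones from the text (85%-88%).
ratings = {"Club A": 85, "Club B": 86, "Club C": 88, "Club D": 85}

def bar_heights(ratings, axis_min, axis_max):
    """Fraction of the visible axis that each bar occupies."""
    span = axis_max - axis_min
    return {club: (value - axis_min) / span for club, value in ratings.items()}

# Truncated axis (83-89), like the first chart: differences look huge.
truncated = bar_heights(ratings, 83, 89)
# Full axis (0-100): the same differences nearly vanish.
full = bar_heights(ratings, 0, 100)

print(round(truncated["Club C"] - truncated["Club A"], 3))  # 0.5 -> Club C towers half a chart higher
print(round(full["Club C"] - full["Club A"], 3))            # 0.03 -> barely visible
```

The same 3-point gap fills half the chart on the truncated axis but only 3% of it on the full axis, which is exactly the trick the ad is playing.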

Also, in order to be a critical thinker, you need to learn to pay attention to the assumptions that underlie a message. Let us briefly illustrate the role of assumptions by touching on some people’s beliefs about the criminal justice system in the US. Some believe that a major problem with our judicial system is that many criminals go free because of legal technicalities. Others believe that a major problem is that many innocent people are convicted of crimes. The simple fact is, both types of errors occur. A person’s conclusion about which flaw in our judicial system is the greater tragedy is based on an assumption about which of these is the more serious error (letting the guilty go free or convicting the innocent). This type of assumption is called a value assumption (Browne and Keeley, 2018). It reflects the differences in values that people develop, differences that may lead us to disregard valid evidence that does not fit in with our particular values.

Oh, by the way, some students probably noticed this, but the seven tips for evaluating information that we shared in Module 1 are related to this section. Actually, they are part of it. The tips are, to a very large degree, a set of ideas you can use to help you identify biases, distortions, omissions, and assumptions. If you do not remember those tips, we strongly recommend you take a few minutes to review them.

skepticism :  a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided

bias : an inclination, tendency, leaning, or prejudice

  • Which of your beliefs (or disbeliefs) from the Activate exercise for this section were derived from a process of critical thinking? If some of your beliefs were not based on critical thinking, are you willing to reassess these beliefs? If the answer is no, why do you think that is? If the answer is yes, what concrete steps will you take?

7.2 Reasoning and Judgment

  • What percentage of kidnappings are committed by strangers?
  • Which area of the house is riskiest: kitchen, bathroom, or stairs?
  • What is the most common cancer in the US?
  • What percentage of workplace homicides are committed by co-workers?

An essential set of procedural thinking skills is  reasoning , the ability to generate and evaluate solid conclusions from a set of statements or evidence. You should note that these conclusions (when they are generated instead of being evaluated) are one key type of inference that we described in Section 7.1. There are two main types of reasoning, deductive and inductive.

Deductive reasoning

Suppose your teacher tells you that if you get an A on the final exam in a course, you will get an A for the whole course. Then, you get an A on the final exam. What will your final course grade be? Most people can see instantly that you can conclude with certainty that you will get an A for the course. This is a type of reasoning called  deductive reasoning , which is defined as reasoning in which a conclusion is guaranteed to be true as long as the statements leading to it are true. The three statements can be listed as an  argument , with two beginning statements and a conclusion:

Statement 1: If you get an A on the final exam, you will get an A for the course

Statement 2: You get an A on the final exam

Conclusion: You will get an A for the course

This particular arrangement, in which true beginning statements lead to a guaranteed true conclusion, is known as a  deductively valid argument . Although deductive reasoning is often the subject of abstract, brain-teasing, puzzle-like word problems, it is actually an extremely important type of everyday reasoning. It is just hard to recognize sometimes. For example, imagine that you are looking for your car keys and you realize that they are either in the kitchen drawer or in your book bag. After looking in the kitchen drawer, you instantly know that they must be in your book bag. That conclusion results from a simple deductive reasoning argument. In addition, solid deductive reasoning skills are necessary for you to succeed in the sciences, philosophy, math, computer programming, and any endeavor involving the use of logic to persuade others to your point of view or to evaluate others’ arguments.

Cognitive psychologists, and before them philosophers, have been quite interested in deductive reasoning, not so much for its practical applications, but for the insights it can offer them about the ways that human beings think. One of the early ideas to emerge from the examination of deductive reasoning is that people learn (or develop) mental versions of rules that allow them to solve these types of reasoning problems (Braine, 1978; Braine, Reiser, & Rumain, 1984). The best way to see this point of view is to realize that there are different possible rules, and some of them are very simple. For example, consider this rule of logic:

p or q

not p

therefore q

Logical rules are often presented abstractly, as letters, in order to imply that they can be used in very many specific situations. Here is a concrete version of the same rule:

I’ll either have pizza or a hamburger for dinner tonight (p or q)

I won’t have pizza (not p)

Therefore, I’ll have a hamburger (therefore q)

This kind of reasoning seems so natural, so easy, that it is quite plausible that we would use a version of this rule in our daily lives. At least, it seems more plausible than some of the alternative possibilities—for example, that we need to have experience with the specific situation (pizza or hamburger, in this case) in order to solve this type of problem easily. So perhaps there is a form of natural logic (Rips, 1990) that contains very simple versions of logical rules. When we are faced with a reasoning problem that maps onto one of these rules, we use the rule.
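The claim that this rule is deductively valid can itself be checked mechanically: enumerate every possible assignment of true and false to p and q, and confirm that no assignment makes every premise true while the conclusion is false. A minimal sketch (the helper name `is_valid` is ours, not part of any logic library):

```python
from itertools import product

def is_valid(premises, conclusion):
    """An argument form is valid if no assignment of truth values
    makes every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample
    return True

# "p or q; not p; therefore q" (pizza or hamburger; no pizza; so hamburger)
print(is_valid([lambda p, q: p or q,
                lambda p, q: not p],
               lambda p, q: q))  # True: the rule is deductively valid
```

Dropping the second premise breaks the argument: `is_valid([lambda p, q: p or q], lambda p, q: q)` returns `False`, because p true and q false satisfies the lone premise while falsifying the conclusion.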

But be very careful; things are not always as easy as they seem. Even these simple rules are not so simple. For example, consider the following rule. Many people fail to realize that this rule is just as valid as the pizza or hamburger rule above.

if p, then q

not q

therefore, not p

Concrete version:

If I eat dinner, then I will have dessert

I did not have dessert

Therefore, I did not eat dinner
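This rule (logicians call it modus tollens) passes the same mechanical check as the pizza-or-hamburger rule: no way of assigning true or false to p and q makes both premises true and the conclusion false. A quick sketch:

```python
from itertools import product

# Modus tollens: "if p, then q; not q; therefore, not p".
# Valid iff no truth assignment makes both premises true and the conclusion false.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if ((not p) or q)   # premise 1: if p then q
    and (not q)         # premise 2: not q
    and not (not p)     # conclusion fails: p is actually true
]
print(counterexamples)  # [] -> no counterexample exists, so the form is valid
```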

The simple fact is, it can be very difficult for people to apply rules of deductive logic correctly; as a result, they make many errors when trying to do so. Is this a deductively valid argument or not?

Students who like school study a lot

Students who study a lot get good grades

Jane does not like school

Therefore, Jane does not get good grades

Many people are surprised to discover that this is not a logically valid argument; the conclusion is not guaranteed to be true from the beginning statements. Although the first statement says that students who like school study a lot, it does NOT say that students who do not like school do not study a lot. In other words, it may very well be possible to study a lot without liking school. Even people who sometimes get problems like this right might not be using the rules of deductive reasoning. Instead, they might just be making judgments based on examples they know, in this case, remembering instances of people who get good grades despite not liking school.
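The same enumeration idea exposes the flaw in the Jane argument. Treat “likes school,” “studies a lot,” and “gets good grades” as three unknowns and search for a situation in which every premise is true but the conclusion is false (the variable names are ours, chosen for readability):

```python
from itertools import product

# Premises: likes -> studies; studies -> grades; Jane does not like school.
# Claimed conclusion: Jane does not get good grades.
counterexamples = [
    (likes, studies, grades)
    for likes, studies, grades in product([True, False], repeat=3)
    if ((not likes) or studies)    # premise 1: liking school implies studying
    and ((not studies) or grades)  # premise 2: studying implies good grades
    and (not likes)                # premise 3: Jane does not like school
    and grades                     # conclusion fails: Jane gets good grades
]
print(counterexamples)  # [(False, True, True), (False, False, True)]
```

Two counterexamples turn up, and in both Jane gets good grades despite not liking school, which is exactly the possibility the premises leave open; therefore the argument is invalid.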

Making deductive reasoning even more difficult is the fact that there are two important properties that an argument may have. One, it can be valid or invalid (meaning that the conclusion does or does not follow logically from the statements leading up to it). Two, an argument (or more correctly, its conclusion) can be true or false. Here is an example of an argument that is logically valid, but has a false conclusion (at least we think it is false).

Either you are eleven feet tall or the Grand Canyon was created by a spaceship crashing into the earth.

You are not eleven feet tall

Therefore the Grand Canyon was created by a spaceship crashing into the earth

This argument has the exact same form as the pizza or hamburger argument above, making it deductively valid. The conclusion is so false, however, that it is absurd (of course, the reason the conclusion is false is that the first statement is false). When people are judging arguments, they tend not to observe the difference between deductive validity and the empirical truth of statements or conclusions. If the elements of an argument happen to be true, people are likely to judge the argument logically valid; if the elements are false, they will very likely judge it invalid (Markovits & Bouffard-Bouchard, 1992; Moshman & Franks, 1986). Thus, it seems a stretch to say that people are using these logical rules to judge the validity of arguments. Many psychologists believe that most people actually have very limited deductive reasoning skills (Johnson-Laird, 1999). They argue that when faced with a problem for which deductive logic is required, people resort to some simpler technique, such as matching terms that appear in the statements and the conclusion (Evans, 1982). This might not seem like a problem, but what if reasoners believe that the elements are true and they happen to be wrong? They would believe that they are using a form of reasoning that guarantees they are correct and yet still be wrong.

deductive reasoning :  a type of reasoning in which the conclusion is guaranteed to be true any time the statements leading up to it are true

argument :  a set of statements in which the beginning statements lead to a conclusion

deductively valid argument :  an argument for which true beginning statements guarantee that the conclusion is true

Inductive reasoning and judgment

Every day, you make many judgments about the likelihood of one thing or another. Whether you realize it or not, you are practicing  inductive reasoning   on a daily basis. In inductive reasoning arguments, a conclusion is likely whenever the statements preceding it are true. The first thing to notice about inductive reasoning is that, by definition, you can never be sure about your conclusion; you can only estimate how likely the conclusion is. Inductive reasoning may lead you to focus on Memory Encoding and Recoding when you study for the exam, but it is possible the instructor will ask more questions about Memory Retrieval instead. Unlike deductive reasoning, the conclusions you reach through inductive reasoning are only probable, not certain. That is why scientists consider inductive reasoning weaker than deductive reasoning. But imagine how hard it would be for us to function if we could not act unless we were certain about the outcome.

Inductive reasoning can be represented as logical arguments consisting of statements and a conclusion, just as deductive reasoning can be. In an inductive argument, you are given some statements and a conclusion (or you are given some statements and must draw a conclusion). An argument is  inductively strong   if the conclusion would be very probable whenever the statements are true. So, for example, here is an inductively strong argument:

  • Statement #1: The forecaster on Channel 2 said it is going to rain today.
  • Statement #2: The forecaster on Channel 5 said it is going to rain today.
  • Statement #3: It is very cloudy and humid.
  • Statement #4: You just heard thunder.
  • Conclusion (or judgment): It is going to rain today.

Think of the statements as evidence, on the basis of which you will draw a conclusion. So, based on the evidence presented in the four statements, it is very likely that it will rain today. Will it definitely rain today? Certainly not. We can all think of times that the weather forecaster was wrong.

A true story: Some years ago, a psychology student was watching a baseball playoff game between the St. Louis Cardinals and the Los Angeles Dodgers. A graphic on the screen had just informed the audience that the Cardinal at bat, (Hall of Fame shortstop) Ozzie Smith, a switch hitter batting left-handed for this plate appearance, had never, in nearly 3000 career at-bats, hit a home run left-handed. The student, who had just learned about inductive reasoning in his psychology class, turned to his companion (a Cardinals fan) and smugly said, “It is an inductively strong argument that Ozzie Smith will not hit a home run.” He turned back to face the television just in time to watch the ball sail over the right field fence for a home run. Although the student felt foolish at the time, he was not wrong. It was an inductively strong argument; 3000 at-bats is an awful lot of evidence suggesting that the Wizard of Oz (as he was known) would not be hitting one out of the park (think of each at-bat without a home run as a statement in an inductive argument). Sadly (for the die-hard Cubs fan and Cardinals-hating student), despite the strength of the argument, the conclusion was wrong.

Given the possibility that we might draw an incorrect conclusion even with an inductively strong argument, we really want to be sure that we do, in fact, make inductively strong arguments. If we judge something probable, it had better be probable. If we judge something nearly impossible, it had better not happen. Think of inductive reasoning, then, as making reasonably accurate judgments of the probability of some conclusion given a set of evidence.

We base many decisions in our lives on inductive reasoning. For example:

Statement #1: Psychology is not my best subject

Statement #2: My psychology instructor has a reputation for giving difficult exams

Statement #3: My first psychology exam was much harder than I expected

Judgment: The next exam will probably be very difficult.

Decision: I will study tonight instead of watching Netflix.

Some other examples of judgments that people commonly make in a school context include judgments of the likelihood that:

  • A particular class will be interesting/useful/difficult
  • You will be able to finish writing a paper by next week if you go out tonight
  • Your laptop’s battery will last through the next trip to the library
  • You will not miss anything important if you skip class tomorrow
  • Your instructor will not notice if you skip class tomorrow
  • You will be able to find a book that you will need for a paper
  • There will be an essay question about Memory Encoding on the next exam

Tversky and Kahneman (1983) recognized that there are two general ways that we might make these judgments; they termed them extensional (i.e., following the laws of probability) and intuitive (i.e., using shortcuts or heuristics, see below). We will use a similar distinction between Type 1 and Type 2 thinking, as described by Keith Stanovich and his colleagues (Evans and Stanovich, 2013; Stanovich and West, 2000). Type 1 thinking is fast, automatic, effortless, and emotional. In fact, it is hardly fair to call it reasoning at all, as judgments just seem to pop into one’s head. Type 2 thinking , on the other hand, is slow, effortful, and logical. So obviously, it is more likely to lead to a correct judgment, or an optimal decision. The problem is, we tend to over-rely on Type 1. Now, we are not saying that Type 2 is the right way to go for every decision or judgment we make. It seems a bit much, for example, to engage in a step-by-step logical reasoning procedure to decide whether we will have chicken or fish for dinner tonight.

Many bad decisions in some very important contexts, however, can be traced back to poor judgments of the likelihood of certain risks or outcomes that result from the use of Type 1 when a more logical reasoning process would have been more appropriate. For example:

Statement #1: It is late at night.

Statement #2: Albert has been drinking beer for the past five hours at a party.

Statement #3: Albert is not exactly sure where he is or how far away home is.

Judgment: Albert will have no difficulty walking home.

Decision: He walks home alone.

As you can see in this example, the three statements backing up the judgment do not really support it. In other words, this argument is not inductively strong because it is based on judgments that ignore the laws of probability. What are the chances that someone facing these conditions will be able to walk home alone easily? And one need not be drunk to make poor decisions based on judgments that just pop into our heads.

The truth is that many of our probability judgments do not come very close to what the laws of probability say they should be. Think about it. In order for us to reason in accordance with these laws, we would need to know the laws of probability, which would allow us to calculate the relationship between particular pieces of evidence and the probability of some outcome (i.e., how much likelihood should change given a piece of evidence), and we would have to do these heavy math calculations in our heads. After all, that is what Type 2 requires. Needless to say, even if we were motivated, we often do not even know how to apply Type 2 reasoning in many cases.
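To make the “heavy math” concrete, here is the kind of calculation Type 2 reasoning would demand. Every number below is invented for illustration: suppose rain occurs on 30% of days, and a forecaster predicts rain on 80% of rainy days but also (falsely) on 10% of dry days. Bayes’ rule then says exactly how much a rain forecast should change our estimate:

```python
# Bayes' rule: P(rain | forecast) = P(forecast | rain) * P(rain) / P(forecast)
# All numbers below are invented for illustration.
p_rain = 0.30                 # prior probability of rain on any given day
p_forecast_given_rain = 0.80  # forecaster says rain when it does rain
p_forecast_given_dry = 0.10   # false alarms on dry days

# Total probability of hearing a rain forecast, rainy and dry days combined.
p_forecast = (p_forecast_given_rain * p_rain
              + p_forecast_given_dry * (1 - p_rain))

# Updated (posterior) probability of rain, given the forecast.
p_rain_given_forecast = p_forecast_given_rain * p_rain / p_forecast

print(round(p_rain_given_forecast, 3))  # 0.774
```

One piece of evidence moves the probability from 0.30 to about 0.77, and the calculation requires knowing and combining three separate probabilities. It is little wonder that we rarely reason this way in our heads.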

So what do we do when we don’t have the knowledge, skills, or time required to make the correct mathematical judgment? Do we hold off and wait until we can get better evidence? Do we read up on probability and fire up our calculator app so we can compute the correct probability? Of course not. We rely on Type 1 thinking. We “wing it.” That is, we come up with a likelihood estimate using some means at our disposal. Psychologists use the term heuristic to describe the type of “winging it” we are talking about. A  heuristic   is a shortcut strategy that we use to make some judgment or solve some problem (see Section 7.3). Heuristics are easy and quick; think of them as the basic procedures that are characteristic of Type 1.  They can absolutely lead to reasonably good judgments and decisions in some situations (like choosing between chicken and fish for dinner). They are, however, far from foolproof. There are, in fact, quite a lot of situations in which heuristics can lead us to make incorrect judgments, and in many cases the decisions based on those judgments can have serious consequences.

Let us return to the activity that begins this section. You were asked to judge the likelihood (or frequency) of certain events and risks. You were free to come up with your own evidence (or statements) to make these judgments. This is where a heuristic crops up. As a judgment shortcut, we tend to generate specific examples of those very events to help us decide their likelihood or frequency. For example, if we are asked to judge how common, frequent, or likely a particular type of cancer is, many of our statements would be examples of specific cancer cases:

Statement #1: Andy Kaufman (comedian) had lung cancer.

Statement #2: Colin Powell (US Secretary of State) had prostate cancer.

Statement #3: Bob Marley (musician) had skin and brain cancer

Statement #4: Sandra Day O’Connor (Supreme Court Justice) had breast cancer.

Statement #5: Fred Rogers (children’s entertainer) had stomach cancer.

Statement #6: Robin Roberts (news anchor) had breast cancer.

Statement #7: Bette Davis (actress) had breast cancer.

Judgment: Breast cancer is the most common type.

Your own experience or memory may also tell you that breast cancer is the most common type. But it is not (although it is common). Actually, skin cancer is the most common type in the US. We make the same types of misjudgments all the time because we do not generate the examples or evidence according to their actual frequencies or probabilities. Instead, we have a tendency (or bias) to search for the examples in memory; if they are easy to retrieve, we assume that they are common. To rephrase this in the language of the heuristic, events seem more likely to the extent that they are available to memory. This bias has been termed the  availability heuristic   (Tversky and Kahneman, 1974).

The fact that we use the availability heuristic does not automatically mean that our judgment is wrong. The reason we use heuristics in the first place is that they work fairly well in many cases (and, of course, that they are easy to use). The easiest examples to think of are sometimes, in fact, the most common ones. Is it more likely that a member of the U.S. Senate is a man or a woman? Most people have a much easier time generating examples of male senators. And as it turns out, the U.S. Senate has many more men than women (74 to 26 in 2020). In this case, then, the availability heuristic would lead you to make the correct judgment; it is far more likely that a senator would be a man.

In many other cases, however, the availability heuristic will lead us astray. This is because events can be memorable for many reasons other than their frequency. Section 5.2, Encoding Meaning, suggested that one good way to encode the meaning of some information is to form a mental image of it. Thus, information that has been pictured mentally will be more available to memory. Indeed, an event that is vivid and easily pictured will trick many people into supposing that type of event is more common than it actually is. Repetition of information will also make it more memorable. So, if the same event is described to you in a magazine, on the evening news, on a podcast that you listen to, and in your Facebook feed, it will be very available to memory. Again, the availability heuristic will cause you to misperceive the frequency of these types of events.

Most interestingly, information that is unusual is more memorable. Suppose we give you the following list of words to remember: box, flower, letter, platypus, oven, boat, newspaper, purse, drum, car. Very likely, the easiest word to remember would be platypus, the unusual one. The same thing occurs with memories of events. An event may be available to memory because it is unusual, yet the availability heuristic leads us to judge that the event is common. Did you catch that? In these cases, the availability heuristic makes us think the exact opposite of the true frequency. We end up thinking something is common because it is unusual (and therefore memorable). Yikes.

The misapplication of the availability heuristic sometimes has unfortunate results. For example, if you went to K-12 school in the US over the past 10 years, it is extremely likely that you have participated in lockdown and active shooter drills. Of course, everyone is trying to prevent the tragedy of another school shooting. And believe us, we are not trying to minimize how terrible the tragedy is. But the truth of the matter is, school shootings are extremely rare. Because the federal government does not keep a database of school shootings, the Washington Post has maintained its own running tally. Between 1999 and January 2020 (the date of the most recent school shooting with a death in the US as of the time this paragraph was written), the Post reported that a total of 254 people died in school shootings in the US. Not 254 per year, 254 total. That is an average of 12 per year. Of course, that is 254 people who should not have died (particularly because many were children), but in a country with approximately 60,000,000 students and teachers, this is a very small risk.

But many students and teachers are terrified that they will be victims of school shootings because of the availability heuristic. It is so easy to think of examples (they are very available to memory) that people believe the event is very common. It is not. And there is a downside to this. We happen to believe that there is an enormous gun violence problem in the United States. According to the Centers for Disease Control and Prevention, there were 39,773 firearm deaths in the US in 2017. Sixty percent of those deaths were suicides; fifteen, according to the Post, were in school shootings. When people pay attention to the school shooting risk (low), they often fail to notice the much larger risk.

And examples like this are by no means unique. The authors of this book have been teaching psychology since the 1990s. We have been able to make the exact same arguments about the misapplication of the availability heuristic and keep them current simply by swapping in the “fear of the day.” In the 1990s it was children being kidnapped by strangers (known as “stranger danger”), despite the facts that kidnappings accounted for only 2% of the violent crimes committed against children and that only 24% of kidnappings were committed by strangers (US Department of Justice, 2007). This fear overlapped with the fear of terrorism that gripped the country after the 2001 terrorist attacks on the World Trade Center and US Pentagon and still plagues the population of the US somewhat in 2020. After a well-publicized, sensational act of violence, people are extremely likely to increase their estimates of the chances that they, too, will be victims of terror. Think about the reality, however. In October of 2001, a terrorist mailed anthrax spores to members of the US government and a number of media companies. A total of five people died as a result of this attack. The nation was nearly paralyzed by the fear of dying from the attack; in reality the probability of an individual person dying was 0.00000002.
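The risk figures quoted in the last few paragraphs can be reproduced with a few lines of arithmetic. This is only a sketch; the 2001 US population of roughly 280 million is our assumption, not a figure given in the text.

```python
# Assumption: the 2001 US population was roughly 280 million (not stated in the text).
anthrax_deaths = 5
us_population_2001 = 280_000_000
print(anthrax_deaths / us_population_2001)     # on the order of 2e-08, i.e., 0.00000002

# School shootings: 254 deaths over the 21 years from 1999 through January 2020.
deaths, years, students_and_teachers = 254, 21, 60_000_000
print(round(deaths / years))                   # about 12 deaths per year
print(deaths / years / students_and_teachers)  # annual per-person risk, about 2e-07
```

Either way the calculation is done, the annual per-person risk comes out vanishingly small, which is the point of the comparison with the 39,773 total firearm deaths.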

The availability heuristic can lead you to make incorrect judgments in a school setting as well. For example, suppose you are trying to decide if you should take a class from a particular math professor. You might try to make a judgment of how good a teacher she is by recalling instances of friends and acquaintances making comments about her teaching skill. You may have some examples that suggest that she is a poor teacher very available to memory, so on the basis of the availability heuristic you judge her a poor teacher and decide to take the class from someone else. What if, however, the instances you recalled were all from the same person, and this person happens to be a very colorful storyteller? The subsequent ease of remembering the instances might not indicate that the professor is a poor teacher after all.

Although the availability heuristic is obviously important, it is not the only judgment heuristic we use. Amos Tversky and Daniel Kahneman examined the role of heuristics in inductive reasoning in a long series of studies. Kahneman received a Nobel Prize in Economics for this research in 2002, and Tversky would have certainly received one as well if he had not died of melanoma at age 59 in 1996 (Nobel Prizes are not awarded posthumously). Kahneman and Tversky demonstrated repeatedly that people do not reason in ways that are consistent with the laws of probability. They identified several heuristic strategies that people use instead to make judgments about likelihood. The importance of this work for economics (and the reason that Kahneman was awarded the Nobel Prize) is that earlier economic theories had assumed that people do make judgments rationally, that is, in agreement with the laws of probability.

Another common heuristic that people use for making judgments is the  representativeness heuristic (Kahneman & Tversky, 1973). Suppose we describe a person to you. He is quiet and shy, has an unassuming personality, and likes to work with numbers. Is this person more likely to be an accountant or an attorney? If you said accountant, you were probably using the representativeness heuristic. Our imaginary person is judged likely to be an accountant because he resembles, or is representative of the concept of, an accountant. When research participants are asked to make judgments such as these, the only thing that seems to matter is the representativeness of the description. For example, if told that the person described is in a room that contains 70 attorneys and 30 accountants, participants will still assume that he is an accountant.
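What the base rates should do to such a judgment can be sketched with Bayes' rule. The likelihoods below (the description fits 90% of accountants but only 20% of attorneys) are invented for illustration; only the 30/70 room split comes from the text.

```python
def posterior_accountant(p_desc_given_acct, p_desc_given_atty, n_acct, n_atty):
    """Bayes' rule: weigh how well the description fits by the base rates in the room."""
    prior_acct = n_acct / (n_acct + n_atty)
    prior_atty = n_atty / (n_acct + n_atty)
    numerator = p_desc_given_acct * prior_acct
    return numerator / (numerator + p_desc_given_atty * prior_atty)

# Invented likelihoods: the description fits 90% of accountants, 20% of attorneys.
p = posterior_accountant(0.9, 0.2, n_acct=30, n_atty=70)
print(round(p, 2))  # 0.66: likelier an accountant, but far from certain
```

Even with a description that strongly favors "accountant," the 70/30 base rate pulls the probability down to about two thirds, well short of the near-certainty that judging by resemblance alone suggests.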

inductive reasoning :  a type of reasoning in which we make judgments about likelihood from sets of evidence

inductively strong argument :  an inductive argument in which the beginning statements lead to a conclusion that is probably true

heuristic :  a shortcut strategy that we use to make judgments and solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

availability heuristic :  judging the frequency or likelihood of some event type according to how easily examples of the event can be called to mind (i.e., how available they are to memory)

representativeness heuristic:   judging the likelihood that something is a member of a category on the basis of how much it resembles a typical category member (i.e., how representative it is of the category)

Type 1 thinking : fast, automatic, and emotional thinking.

Type 2 thinking : slow, effortful, and logical thinking.

  • What percentage of workplace homicides are co-worker violence?

Many people get these questions wrong. The answers are 10%; stairs; skin; 6%. How close were your answers? Explain how the availability heuristic might have led you to make the incorrect judgments.

  • Can you think of some other judgments that you have made (or beliefs that you have) that might have been influenced by the availability heuristic?

7.3 Problem Solving

  • Please take a few minutes to list a number of problems that you are facing right now.
  • Now write about a problem that you recently solved.
  • What is your definition of a problem?

Mary has a problem. Her daughter, ordinarily quite eager to please, appears to delight in being the last person to do anything. Whether getting ready for school, going to piano lessons or karate class, or even going out with her friends, she seems unwilling or unable to get ready on time. Other people have different kinds of problems. For example, many students work at jobs, have numerous family commitments, and are facing a course schedule full of difficult exams, assignments, papers, and speeches. How can they find enough time to devote to their studies and still fulfill their other obligations? Speaking of students and their problems: Show that a ball thrown vertically upward with initial velocity v0 takes twice as much time to return as to reach the highest point (from Spiegel, 1981).

These are three very different situations, but we have called them all problems. What makes them all the same, despite the differences? A psychologist might define a  problem   as a situation with an initial state, a goal state, and a set of possible intermediate states. Somewhat more meaningfully, we might consider a problem a situation in which you are here, in one state (e.g., daughter is always late), you want to be there, in another state (e.g., daughter is not always late), and there is no obvious way to get from here to there. Defined this way, each of the three situations we outlined can now be seen as an example of the same general concept, a problem. At this point, you might begin to wonder what is not a problem, given such a general definition. It seems that nearly every non-routine task we engage in could qualify as a problem. As long as you realize that problems are not necessarily bad (it can be quite fun and satisfying to rise to the challenge and solve a problem), this may be a useful way to think about it.

Can we identify a set of problem-solving skills that would apply to these very different kinds of situations? That task, in a nutshell, is a major goal of this section. Let us try to begin to make sense of the wide variety of ways that problems can be solved with an important observation: the process of solving problems can be divided into two key parts. First, people have to notice, comprehend, and represent the problem properly in their minds (called  problem representation ). Second, they have to apply some kind of solution strategy to the problem. Psychologists have studied both of these key parts of the process in detail.

When you first think about the problem-solving process, you might guess that most of our difficulties would occur because we are failing in the second step, the application of strategies. Although this can be a significant difficulty much of the time, the more important source of difficulty is probably problem representation. In short, we often fail to solve a problem because we are looking at it, or thinking about it, the wrong way.

problem :  a situation in which we are in an initial state, have a desired goal state, and there is a number of possible intermediate states (i.e., there is no obvious way to get from the initial to the goal state)

problem representation :  noticing, comprehending and forming a mental conception of a problem

Defining and Mentally Representing Problems in Order to Solve Them

So, the main obstacle to solving a problem is often that we do not clearly understand exactly what the problem is. Recall the problem of Mary’s daughter always being late. One way to represent, or to think about, this problem is that she is being defiant: she refuses to get ready in time. This type of representation or definition suggests a particular type of solution. Another way to think about the problem, however, is to consider the possibility that she is simply being sidetracked by interesting diversions. This different conception of what the problem is (i.e., a different representation) suggests a very different solution strategy. For example, if Mary defines the problem as defiance, she may be tempted to solve it using some kind of coercive tactics, that is, to assert her parental authority and force her daughter to comply. On the other hand, if Mary defines the problem as distraction, she may try to solve it by simply removing the distracting objects.

As you might guess, when a problem is represented one way, the solution may seem very difficult, or even impossible. Seen another way, the solution might be very easy. For example, consider the following problem (from Nasar, 1998):

Two bicyclists start 20 miles apart and head toward each other, each going at a steady rate of 10 miles per hour. At the same time, a fly that travels at a steady 15 miles per hour starts from the front wheel of the southbound bicycle and flies to the front wheel of the northbound one, then turns around and flies to the front wheel of the southbound one again, and continues in this manner until he is crushed between the two front wheels. Question: what total distance did the fly cover?

Please take a few minutes to try to solve this problem.

Most people represent this problem as a question about a fly because, well, that is how the question is asked. The solution, using this representation, is to figure out how far the fly travels on the first leg of its journey, then add this total to how far it travels on the second leg of its journey (when it turns around and returns to the first bicycle), then continue to add the smaller distance from each leg of the journey until you converge on the correct answer. You would have to be quite skilled at math to solve this problem, and you would probably need some time and pencil and paper to do it.

If you consider a different representation, however, you can solve this problem in your head. Instead of thinking about it as a question about a fly, think about it as a question about the bicycles. They are 20 miles apart, and each is traveling 10 miles per hour. How long will it take for the bicycles to reach each other? Right, one hour. The fly is traveling 15 miles per hour; therefore, it will travel a total of 15 miles back and forth in the hour before the bicycles meet. Represented one way (as a problem about a fly), the problem is quite difficult. Represented another way (as a problem about two bicycles), it is easy. Changing your representation of a problem is sometimes the best—sometimes the only—way to solve it.
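The two representations can even be checked numerically. This is a minimal sketch using the distances and speeds from the problem: the hard representation sums the fly's back-and-forth legs one at a time, while the easy one just multiplies the fly's speed by the hour it takes the bicycles to meet.

```python
def fly_distance_series(gap=20.0, bike=10.0, fly=15.0, legs=50):
    """Hard representation: sum the fly's legs between the converging bicycles."""
    total = 0.0
    for _ in range(legs):
        t = gap / (fly + bike)   # time until the fly meets the oncoming bicycle
        total += fly * t         # distance flown on this leg
        gap -= 2 * bike * t      # both bicycles kept closing the gap meanwhile
    return total

def fly_distance_simple(gap=20.0, bike=10.0, fly=15.0):
    """Easy representation: the bicycles meet in gap/(2*bike) hours;
    the fly travels at a steady speed the whole time."""
    return fly * (gap / (2 * bike))

print(fly_distance_simple())   # 15.0 miles
print(fly_distance_series())   # converges to the same 15 miles
```

The leg-by-leg sum converges to exactly the answer the simple representation gives in one line, which is the whole moral of the example.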

Unfortunately, however, changing a problem’s representation is not the easiest thing in the world to do. Often, problem solvers get stuck looking at a problem one way. This is called  fixation . Most people who represent the preceding problem as a problem about a fly probably do not pause to reconsider, and consequently change, their representation. A parent who thinks her daughter is being defiant is unlikely to consider the possibility that her behavior is far less purposeful.

Problem-solving fixation was examined by a group of German psychologists called Gestalt psychologists during the 1930s and 1940s. Karl Duncker, for example, discovered an important type of failure to take a different perspective called  functional fixedness . Imagine being a participant in one of his experiments. You are asked to figure out how to mount two candles on a door and are given an assortment of odds and ends, including a small empty cardboard box and some thumbtacks. Perhaps you have already figured out a solution: tack the box to the door so it forms a platform, then put the candles on top of the box. Most people are able to arrive at this solution. Imagine a slight variation of the procedure, however. What if, instead of being empty, the box had matches in it? Most people given this version of the problem do not arrive at the solution given above. Why? Because it seems to people that when the box contains matches, it already has a function; it is a matchbox. People are unlikely to consider a new function for an object that already has a function. This is functional fixedness.

Mental set is a type of fixation in which the problem solver gets stuck using the same solution strategy that has been successful in the past, even though the solution may no longer be useful. It is commonly seen when students do math problems for homework. Often, several problems in a row require the reapplication of the same solution strategy. Then, without warning, the next problem in the set requires a new strategy. Many students attempt to apply the formerly successful strategy on the new problem and therefore cannot come up with a correct answer.

The thing to remember is that you cannot solve a problem unless you correctly identify what it is to begin with (initial state) and what you want the end result to be (goal state). That may mean looking at the problem from a different angle and representing it in a new way. The correct representation does not guarantee a successful solution, but it certainly puts you on the right track.

A bit more optimistically, the Gestalt psychologists discovered what may be considered the opposite of fixation, namely  insight . Sometimes the solution to a problem just seems to pop into your head. Wolfgang Köhler examined insight by posing many different problems to chimpanzees, principally problems pertaining to their acquisition of out-of-reach food. In one version, a banana was placed outside of a chimpanzee’s cage and a short stick inside the cage. The stick was too short to retrieve the banana, but was long enough to retrieve a longer stick also located outside of the cage. This second stick was long enough to retrieve the banana. After trying, and failing, to reach the banana with the shorter stick, the chimpanzee would make a couple of random-seeming attempts, react with some apparent frustration or anger, then suddenly rush to the longer stick, the correct solution fully realized at this point. This sudden appearance of the solution, observed many times with many different problems, was termed insight by Köhler.

Lest you think it pertains to chimpanzees only, in the 1930s Karl Duncker demonstrated that children also solve problems through insight. More importantly, you have probably experienced insight yourself. Think back to a time when you were trying to solve a difficult problem. After struggling for a while, you gave up. Hours later, the solution just popped into your head, perhaps when you were taking a walk, eating dinner, or lying in bed.

fixation :  when a problem solver gets stuck looking at a problem a particular way and cannot change his or her representation of it (or his or her intended solution strategy)

functional fixedness :  a specific type of fixation in which a problem solver cannot think of a new use for an object that already has a function

mental set :  a specific type of fixation in which a problem solver gets stuck using the same solution strategy that has been successful in the past

insight :  a sudden realization of a solution to a problem

Solving Problems by Trial and Error

Correctly identifying the problem and your goal for a solution is a good start, but recall the psychologist’s definition of a problem: it includes a set of possible intermediate states. Viewed this way, a problem can be solved satisfactorily only if one can find a path through some of these intermediate states to the goal. Imagine a fairly routine problem, finding a new route to school when your ordinary route is blocked (by road construction, for example). At each intersection, you may turn left, turn right, or go straight. A satisfactory solution to the problem (of getting to school) is a sequence of selections at each intersection that allows you to wind up at school.

If you had all the time in the world to get to school, you might try choosing intermediate states randomly. At one corner you turn left, the next you go straight, then you go left again, then right, then right, then straight. Unfortunately, trial and error will not necessarily get you where you want to go, and even if it does, it is not the fastest way to get there. For example, when a friend of ours was in college, he got lost on the way to a concert and attempted to find the venue by choosing streets to turn onto randomly (this was long before the use of GPS). Amazingly enough, the strategy worked, although he did end up missing two out of the three bands who played that night.
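A toy simulation (our own illustration, not from the text) makes the inefficiency of random trial and error concrete. Imagine the school at intersection (3, 3) of a city grid: a direct route takes only six blocks, but random turns usually take far longer and sometimes never arrive at all.

```python
import random

def random_route(goal=(3, 3), max_steps=200):
    """Trial and error: turn randomly at each intersection until we hit the goal."""
    x, y = 0, 0
    for step in range(1, max_steps + 1):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
        if (x, y) == goal:
            return step
    return None  # gave up without ever arriving

# A direct route needs only 6 blocks; random wandering usually takes many more
# (None means the wanderer never reached the school within 200 turns).
print([random_route() for _ in range(10)])
```

Run it a few times: some trials get lucky, most meander, and a fair number simply fail, much like our concert-going friend who arrived only after missing two of the three bands.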

Trial and error is not all bad, however. B.F. Skinner, a prominent behaviorist psychologist, suggested that people often behave randomly in order to see what effect the behavior has on the environment and what subsequent effect this environmental change has on them. This seems particularly true for the very young person. Picture a child filling a household’s fish tank with toilet paper, for example. To a child trying to develop a repertoire of creative problem-solving strategies, an odd and random behavior might be just the ticket. Eventually, the exasperated parent hopes, the child will discover that many of these random behaviors do not successfully solve problems; in fact, in many cases they create problems. Thus, one would expect a decrease in this random behavior as a child matures. You should realize, however, that the opposite extreme is equally counterproductive. If children become too rigid, never trying anything unexpected and new, their problem-solving skills can become too limited.

Effective problem solving seems to call for a happy medium between using well-founded old strategies and exploring new territory. The individual who recognizes a situation in which an old problem-solving strategy would work best, and who can also recognize a situation in which a new, untested strategy is necessary, is halfway to success.

Solving Problems with Algorithms and Heuristics

For many problems there is a possible strategy available that will guarantee a correct solution. For example, think about math problems. Math lessons often consist of step-by-step procedures that can be used to solve the problems. If you apply the strategy without error, you are guaranteed to arrive at the correct solution to the problem. This approach is called using an  algorithm , a term that denotes the step-by-step procedure that guarantees a correct solution. Because algorithms are sometimes available and come with a guarantee, you might think that most people use them frequently. Unfortunately, however, they do not. As the experience of many students who have struggled through math classes can attest, algorithms can be extremely difficult to use, even when the problem solver knows which algorithm is supposed to work in solving the problem. In problems outside of math class, we often do not even know if an algorithm is available. It is probably fair to say, then, that algorithms are rarely used when people try to solve problems.

Because algorithms are so difficult to use, people often pass up the opportunity to guarantee a correct solution in favor of a strategy that is much easier to use and yields a reasonable chance of coming up with a correct solution. These strategies are called  problem solving heuristics . Similar to the reasoning heuristics you saw in Section 7.2, a problem solving heuristic is a shortcut strategy that people use when trying to solve problems. It usually works pretty well, but does not guarantee a correct solution to the problem. For example, one problem solving heuristic might be “always move toward the goal” (so when trying to get to school when your regular route is blocked, you would always turn in the direction you think the school is). A heuristic that people might use when doing math homework is “use the same solution strategy that you just used for the previous problem.”

By the way, we hope these last two paragraphs feel familiar to you. They seem to parallel a distinction that you recently learned. Indeed, algorithms and problem-solving heuristics are another example of the distinction between Type 1 thinking and Type 2 thinking.
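The algorithm/heuristic contrast can be made concrete with a small sketch of our own (the making-change problem is not from the text). The dynamic-programming routine is an algorithm: slow and effortful, but guaranteed to find the fewest coins. The greedy routine is a heuristic: quick, usually right, but not guaranteed.

```python
def change_algorithm(coins, amount):
    """Algorithm: dynamic programming, guaranteed to find the fewest coins."""
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        options = [best[a - c] for c in coins if c <= a and best[a - c] is not None]
        best[a] = min(options) + 1 if options else None
    return best[amount]

def change_heuristic(coins, amount):
    """Heuristic: greedily grab the largest coin that fits. Fast, not guaranteed."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

coins = [1, 3, 4]
print(change_algorithm(coins, 6))  # 2 coins (3 + 3)
print(change_heuristic(coins, 6))  # 3 coins (4 + 1 + 1): quick, but not the best
```

Notably, for ordinary US coin denominations the greedy heuristic happens to give the optimal answer every time, which is exactly why heuristics feel so trustworthy: in everyday situations they usually work.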

Although it is probably not worth describing a large number of specific heuristics, two observations about heuristics are worth mentioning. First, heuristics can be very general, or they can be very specific, pertaining to a particular type of problem only. For example, “always move toward the goal” is a general strategy that you can apply to countless problem situations. On the other hand, “when you are lost without a functioning GPS, pick the most expensive car you can see and follow it” is specific to the problem of being lost. Second, not all heuristics are equally useful. One heuristic that many students know is “when in doubt, choose c for a question on a multiple-choice exam.” This is a dreadful strategy because many instructors intentionally randomize the order of answer choices. Another test-taking heuristic, somewhat more useful, is “look for the answer to one question somewhere else on the exam.”

You really should pay attention to the application of heuristics to test taking. Imagine that while reviewing your answers for a multiple-choice exam before turning it in, you come across a question for which you originally thought the answer was c. Upon reflection, you now think that the answer might be b. Should you change the answer to b, or should you stick with your first impression? Most people will apply the heuristic strategy of “stick with your first impression.” What they do not realize, of course, is that this is a very poor strategy (Lilienfeld et al., 2009). Most of the errors on exams come on questions that were answered wrong originally and were not changed (so they remain wrong). There are many fewer errors in which we change a correct answer to an incorrect one. And, of course, sometimes we change an incorrect answer to a correct answer. In fact, research has shown that it is more common to change a wrong answer to a right answer than vice versa (Bruno, 2001).

The belief in this poor test-taking strategy (stick with your first impression) is based on the  confirmation bias   (Nickerson, 1998; Wason, 1960). You first saw the confirmation bias in Module 1, but because it is so important, we will repeat the information here. People have a bias, or tendency, to notice information that confirms what they already believe. Somebody at one time told you to stick with your first impression, so when you look at the results of an exam you have taken, you will tend to notice the cases that are consistent with that belief. That is, you will notice the cases in which you originally had an answer correct and changed it to the wrong answer. You tend not to notice the other two important (and more common) cases, changing an answer from wrong to right, and leaving a wrong answer unchanged.

Because heuristics by definition do not guarantee a correct solution to a problem, mistakes are bound to occur when we employ them. A poor choice of a specific heuristic will lead to an even higher likelihood of making an error.

algorithm :  a step-by-step procedure that guarantees a correct solution to a problem

problem solving heuristic :  a shortcut strategy that we use to solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions

confirmation bias :  people’s tendency to notice information that confirms what they already believe

An Effective Problem-Solving Sequence

You may be left with a big question: If algorithms are hard to use and heuristics often don’t work, how am I supposed to solve problems? Robert Sternberg (1996), as part of his theory of what makes people successfully intelligent (Module 8), described a problem-solving sequence that has been shown to work rather well:

  • Identify the existence of a problem.  In school, problem identification is often easy; problems that you encounter in math classes, for example, are conveniently labeled as problems for you. Outside of school, however, realizing that you have a problem is a key difficulty that you must get past in order to begin solving it. You must be very sensitive to the symptoms that indicate a problem.
  • Define the problem.  Suppose you realize that you have been having many headaches recently. Very likely, you would identify this as a problem. If you define the problem as “headaches,” the solution would probably be to take aspirin or ibuprofen or some other anti-inflammatory medication. If the headaches keep returning, however, you have not really solved the problem—likely because you have mistaken a symptom for the problem itself. Instead, you must find the root cause of the headaches. Stress might be the real problem. For you to successfully solve many problems it may be necessary for you to overcome your fixations and represent the problems differently. One specific strategy that you might find useful is to try to define the problem from someone else’s perspective. How would your parents, spouse, significant other, doctor, etc. define the problem? Somewhere in these different perspectives may lurk the key definition that will allow you to find an easier and permanent solution.
  • Formulate strategy.  Now it is time to begin planning exactly how the problem will be solved. Is there an algorithm or heuristic available for you to use? Remember, heuristics by their very nature do not guarantee success, so occasionally you will not be able to solve the problem with them. One point to keep in mind is that you should look for long-range solutions, which are more likely to address the root cause of a problem than short-range solutions.
  • Represent and organize information.  Similar to the way that the problem itself can be defined, or represented in multiple ways, information within the problem is open to different interpretations. Suppose you are studying for a big exam. You have chapters from a textbook and from a supplemental reader, along with lecture notes that all need to be studied. How should you (represent and) organize these materials? Should you separate them by type of material (text versus reader versus lecture notes), or should you separate them by topic? To solve problems effectively, you must learn to find the most useful representation and organization of information.
  • Allocate resources.  This is perhaps the simplest principle of the problem-solving sequence, but it is extremely difficult for many people. First, you must decide whether time, money, skills, effort, goodwill, or some other resource would help to solve the problem. Then, you must make the hard choice of deciding which resources to use, realizing that you cannot devote maximum resources to every problem. Very often, the solution to a problem is simply to change how resources are allocated (for example, spending more time studying in order to improve grades).
  • Monitor and evaluate solutions.  Pay attention to the solution strategy while you are applying it. If it is not working, you may be able to select another strategy. Another fact you should realize about problem solving is that it never does end. Solving one problem frequently brings up new ones. Good monitoring and evaluation of your problem solutions can help you to anticipate and get a jump on solving the inevitable new problems that will arise.
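
The five steps above can be sketched as a simple retry loop, shown here with the headache example from the text. The function and data names are illustrative placeholders, not a procedure from the source.

```python
# Illustrative sketch of the problem-solving sequence: each attempt pairs a
# problem definition with a strategy; monitoring reveals whether the root
# cause (not just a symptom) was addressed, and if not we redefine and retry.

def solve(attempts):
    for definition, strategy, fixes_root_cause in attempts:
        outcome = strategy()              # apply the chosen strategy
        if fixes_root_cause:              # monitor and evaluate the solution
            return definition, outcome    # problem solved
        # otherwise: redefine the problem and try the next strategy
    return None, "unsolved"

# Defining the problem as "headaches" treats a symptom; "stress" is the root cause.
attempts = [
    ("headaches", lambda: "take aspirin", False),
    ("stress", lambda: "reduce workload", True),
]

print(solve(attempts))  # ('stress', 'reduce workload')
```

The point of the loop is the redefinition step: a strategy that only suppresses a symptom sends you back to "define the problem," exactly as the headache example describes.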

Please note that this is an effective problem-solving sequence, not the effective problem-solving sequence. Just as you can become fixated and end up representing the problem incorrectly or trying an inefficient solution, you can become stuck applying the problem-solving sequence in an inflexible way. Clearly there are problem situations that can be solved without using these skills in this order.

Additionally, many real-world problems may require that you go back and redefine a problem several times as the situation changes (Sternberg et al., 2000). For example, consider the problem with Mary’s daughter one last time. At first, Mary did represent the problem as one of defiance. When her early strategy of pleading and threatening punishment was unsuccessful, Mary began to observe her daughter more carefully. She noticed that, indeed, her daughter’s attention would be drawn by an irresistible distraction or book. Armed with a re-representation of the problem, she began a new solution strategy: reminding her daughter every few minutes to stay on task, and telling her that if she was ready before it was time to leave, she could return to the book or other distracting object. Fortunately, this strategy was successful, so Mary did not have to go back and redefine the problem again.

Pick one or two of the problems that you listed when you first started studying this section and try to work out the steps of Sternberg’s problem solving sequence for each one.

Key Terms

  • Concept: a mental representation of a category of things in the world.
  • Inference: an assumption about the truth of something that is not stated. Inferences come from our prior knowledge and experience, and from logical reasoning.
  • Metacognition: knowledge about one’s own cognitive processes; thinking about your thinking.
  • Dunning-Kruger effect: individuals who are less competent tend to overestimate their abilities more than individuals who are more competent do.
  • Critical thinking: thinking like a scientist in your everyday life for the purpose of drawing correct conclusions. It entails skepticism; an ability to identify biases, distortions, omissions, and assumptions; and excellent deductive and inductive reasoning and problem-solving skills.
  • Skepticism: a way of thinking in which you refrain from drawing a conclusion or changing your mind until good evidence has been provided.
  • Bias: an inclination, tendency, leaning, or prejudice.
  • Deductive reasoning: a type of reasoning in which the conclusion is guaranteed to be true any time the statements leading up to it are true.
  • Argument: a set of statements in which the beginning statements lead to a conclusion.
  • Deductively valid argument: an argument for which true beginning statements guarantee that the conclusion is true.
  • Inductive reasoning: a type of reasoning in which we make judgments about likelihood from sets of evidence.
  • Strong inductive argument: an inductive argument in which the beginning statements lead to a conclusion that is probably true.
  • System 1 thinking: fast, automatic, and emotional thinking.
  • System 2 thinking: slow, effortful, and logical thinking.
  • Heuristic: a shortcut strategy that we use to make judgments and solve problems. Although they are easy to use, they do not guarantee correct judgments and solutions.
  • Availability heuristic: judging the frequency or likelihood of some event type according to how easily examples of the event can be called to mind (i.e., how available they are to memory).
  • Representativeness heuristic: judging the likelihood that something is a member of a category on the basis of how much it resembles a typical category member (i.e., how representative it is of the category).
  • Problem: a situation in which we are in an initial state, have a desired goal state, and there is a number of possible intermediate states (i.e., there is no obvious way to get from the initial to the goal state).
  • Problem representation: noticing, comprehending, and forming a mental conception of a problem.
  • Fixation: when a problem solver gets stuck looking at a problem a particular way and cannot change his or her representation of it (or his or her intended solution strategy).
  • Functional fixedness: a specific type of fixation in which a problem solver cannot think of a new use for an object that already has a function.
  • Mental set: a specific type of fixation in which a problem solver gets stuck using the same solution strategy that has been successful in the past.
  • Insight: a sudden realization of a solution to a problem.
  • Algorithm: a step-by-step procedure that guarantees a correct solution to a problem.
  • Confirmation bias: the tendency to notice and pay attention to information that confirms your prior beliefs and to ignore information that disconfirms them.

Introduction to Psychology Copyright © 2020 by Ken Gray; Elizabeth Arnott-Hill; and Or'Shaundra Benson is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

Radford University

Center for Innovation and Analytics


Problem Solving, Critical Thinking, and Analytical Reasoning Skills Sought by Employers

In this section:

Problem Solving

  • Critical Thinking

Analytical Reasoning

Critical thinking, analytical reasoning, and problem-solving skills are required to perform well on tasks expected by employers. 1 Having good problem-solving and critical thinking skills can make a major difference in a person’s career. 2

Every day, from an entry-level employee to the Chairman of the Board, problems need to be resolved. Whether solving a problem for a client (internal or external), supporting those who are solving problems, or discovering new problems to solve, the challenges faced may range from simple to complex and from easy to difficult.

A fundamental component of every manager's role is solving problems. So, helping students become confident problem solvers is critical to their success, and confidence comes from possessing an efficient and practiced problem-solving process.

Employers want employees with well-founded skills in these areas, so they ask four questions when assessing a job candidate 3 :

  • Evaluation of information: How well does the applicant assess the quality and relevance of information?
  • Analysis and Synthesis of information: How well does the applicant analyze and synthesize data and information?
  • Drawing conclusions: How well does the applicant form a conclusion from their analysis?
  • Acknowledging alternative explanations/viewpoints: How well does the applicant consider other options and acknowledge that their answer is not the only perspective?

When an employer says they want employees who are good at solving complex problems, they are saying they want employees possessing the following skills:

  • Analytical Thinking — A person who can use logic and critical thinking to analyze a situation.
  • Critical Thinking – A person who makes reasoned judgments that are logical and well thought out.
  • Initiative — A person who will step up and take action without being asked. A person who looks for opportunities to make a difference.
  • Creativity — A person who is an original thinker and has the ability to go beyond traditional approaches.
  • Resourcefulness — A person who will adapt to new/difficult situations and devise ways to overcome obstacles.
  • Determination — A person who is persistent and does not give up easily.
  • Results-Oriented — A person whose focus is on getting the problem solved.

Two of the major components of problem-solving skills are critical thinking and analytical reasoning. These two skills top the list of skills employers require of applicants.

Critical Thinking 4

“Mentions of critical thinking in job postings have doubled since 2009, according to an analysis by career-search site Indeed.com.” 5 Making logical and reasoned judgments that are well thought out is at the core of critical thinking. Using critical thinking, an individual will not automatically accept information, or the conclusions drawn from it, to be factual, valid, true, applicable or correct. “When students are taught how to use critical thinking to tap into their creativity to solve problems, they are more successful than other students when they enter management-training programs in large corporations.” 6

A strong applicant questions what they are told and wants to make evidence-based decisions. Employers want employees who ask things such as: “Is that a fact or just an opinion? Is this conclusion based on data or gut feel?” and “If you had additional data, could there be alternative possibilities?” Employers seek employees who possess the skills and abilities to conceptualize, apply, analyze, synthesize, and evaluate information to reach an answer or conclusion.

Employers require critical thinking in employees because it increases the probability of a positive business outcome. Employers want employees whose thinking is intentional, purposeful, reasoned, and goal directed.

Recruiters say they want applicants with problem-solving and critical thinking skills. They “encourage applicants to prepare stories to illustrate their critical-thinking prowess, detailing, for example, the steps a club president took to improve attendance at weekly meetings.” 7

Analytical Reasoning

Employers want students to possess analytical reasoning/thinking skills — meaning they want to hire someone who is good at breaking down problems into smaller parts to find solutions. “The adjective, analytical, and the related verb analyze can both be traced back to the Greek verb, analyein — ‘to break up, to loosen.’ If a student is analytical, you are good at taking a problem or task and breaking it down into smaller elements in order to solve the problem or complete the task.” 9

Analytical reasoning connotes a person's general aptitude to arrive at a logical conclusion or solution to given problems. Just as with critical thinking, analytical thinking critically examines the different parts or details of something to fully understand or explain it. Analytical thinking often requires the person to use “cause and effect, similarities and differences, trends, associations between things, inter-relationships between the parts, the sequence of events, ways to solve complex problems, steps within a process, diagraming what is happening.” 10

Analytical reasoning is the ability to look at information and discern patterns within it. “The pattern could be the structure the author of the information uses to structure an argument, or trends in a large data set. By learning methods of recognizing these patterns, individuals can pull more information out of a text or data set than someone who is not using analytical reasoning to identify deeper patterns.” 11

Employers want employees to have the aptitude to apply analytical reasoning to problems faced by the business. For instance, “a quantitative analyst can break down data into patterns to discern information, such as if a decrease in sales is part of a seasonal pattern of ups and downs or part of a greater downward trend that a business should be worried about. By learning to recognize these patterns in both numbers and written arguments, an individual gains insights into the information that someone who simply takes the information at face value will miss.” 12

Managers with excellent analytical reasoning abilities are considered good at “evaluating problems, analyzing them from more than one angle and finding a solution that works best in the given circumstances.” 13 Businesses want managers who can apply analytical reasoning skills to meet challenges and keep a business functioning smoothly.

A person with good analytical reasoning and pattern recognition skills can see trends in a problem far more easily than someone without them.

Logical Reasoning Tests

Logical reasoning tests are a type of psychometric test used to measure your problem-solving skills. They come in various forms, but all have the underlying purpose of assessing your logical aptitude and your ability to draw conclusions from a given set of information.

What is a logical reasoning test?

A logical reasoning test is an assessment that measures your ability to interpret information, apply logic to solve problems and draw relevant conclusions. It is typically non-verbal and in a multiple-choice format, and requires the use of rules and deduction to reach answers, rather than prior knowledge.

That said, logical reasoning is actually an umbrella term for multiple types of assessment, and you may find you’re asked to take any one of the following five test types as part of a job application.

Deductive reasoning

Commonly presented as a series of word problems, deductive reasoning tests require you to apply top-down logic; that is, you must draw the right conclusion from a set of given premises.

Typically, you’ll be presented with a short paragraph, or stimulus, detailing an argument, scenario or a number of stated facts, and a set of possible answers. Only one of these answers can be true, based on the evidence provided.

You may also be given a conclusive statement and asked to decide if it is true or false, or if there’s insufficient information to conclude either way.

Inductive reasoning

Unlike deductive reasoning, inductive reasoning tests ask you to make general inferences – probable conclusions based on a set of information, rather than unquestionable outcomes.

This is most often done through the use of shapes, patterns, sequences and diagrams.

You’ll need to quickly identify relationships and rules, then apply these to find the most logical answer from the multiple-choice options. This could be identifying the odd one out, filling in the missing part of a pattern, or finding the next part of a sequence.

Diagrammatic reasoning

Similar to inductive reasoning, diagrammatic reasoning tests offer visual representations of a problem and require you to make logical connections to draw a conclusion.

Questions often take the form of a diagram with inputs and outputs, and you’ll be required to select which processes from a list of operators would achieve the documented effect.

You may also be presented with sets of abstract sequences, given a standalone visual, and asked to select which set it belongs to.

Abstract reasoning

Abstract reasoning tests are essentially inductive and/or diagrammatic reasoning tests under another name.

They too require you to find relationships and rules between visual sequences, then apply these to select the correct image from multiple options, be it a missing part or a continuation of the sequence in question.

Critical reasoning

Critical reasoning tests are more akin to deductive reasoning tests, in that you’ll be dealing with word-based scenarios, arguments, evidence and conclusions.

These tests tend to evaluate a range of skills. Argument analysis is common, in which a question is posed, and a yes/no answer given with a supporting statement. You’ll need to decide whether the statement is a strong or weak argument.

Other question types involve scenarios and statements from which you’ll be asked to make assumptions, deductions and inferences based on the evidence provided.

Critical reasoning tests are most commonly used in sectors where evidence-based judgement is an everyday requirement, such as law.

Why do employers use logical reasoning tests?

As with any form of psychometric assessment, employers use logical reasoning tests as a way to filter applicants, most commonly in the pre-interview stages of selection.

Logic forms a fundamental part of day-to-day decision making. Our reasoning capabilities determine how effectively we interpret the world around us, and how we use what we know to be fact to inform our choices. As such, logical reasoning is a vital part of many job functions.

In administering a logical reasoning test, employers are evaluating how well you’re likely to perform tasks like strategy development, risk assessment and forecasting, as well as general problem solving.

Common logical reasoning test publishers

Below are listed five of the most widely used publishers of logical reasoning tests, each of which has its own approach to this type of assessment.

SHL publishes and administers both inductive and deductive reasoning tests, the lengths of which vary depending on the level of role applied for. Typically though, they last no longer than 25 minutes and follow a standard format.

Kenexa’s logical reasoning test focuses on inductive or abstract reasoning, with candidates required to assess and manipulate shapes and sequences. It also has a deductive reasoning test, which it refers to as verbal reasoning.

Cut-e offers both inductive and deductive reasoning tests, with individual variations of each. The layout of Cut-e’s tests is known to be somewhat different to other publishers, so if you’re taking one be sure to practice specifically for this format.

As one of the best-known publishers of psychometric and aptitude assessments, Saville’s logical reasoning tests are widely used. They’re offered as either abstract or diagrammatic reasoning and have a time limit of around 20 to 25 minutes.

Logical reasoning tests from Talent Q are adaptive, which means the difficulty rating of a question is related to your performance on the question prior. Do well initially, and they’ll get harder. Struggle, and they’ll become a little easier.
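
The adaptive behaviour described above can be illustrated with a minimal sketch: difficulty steps up after a correct answer and down after an incorrect one, within a fixed range. This mirrors the description only; it is not Talent Q's actual algorithm, and all names and bounds are invented.

```python
# Minimal sketch of adaptive difficulty selection (illustrative only):
# a correct answer raises the next question's difficulty by one level,
# an incorrect answer lowers it, clamped to the range [low, high].

def next_difficulty(current, answered_correctly, low=1, high=10):
    step = 1 if answered_correctly else -1
    return max(low, min(high, current + step))

difficulty = 5
for correct in [True, True, False, True]:
    difficulty = next_difficulty(difficulty, correct)
print(difficulty)  # 7
```

Real adaptive tests typically use statistical item-response models rather than a fixed step, but the feedback loop (do well, get harder; struggle, get easier) is the same.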

How to prepare for logical reasoning tests

The best way to prepare for a logical reasoning test of any description is to train your brain to think more critically – and that means practice.

Try making puzzles a part of your daily routine, or use brain-training apps in your downtime. If you’re preparing for a deductive or critical thinking test, take an analytical approach to reading the daily news. Instead of simply taking things at face value, ask yourself questions based on the evidence provided, and whether or not it’s enough to draw solid conclusions.

And make sure you take plenty of practice tests. This will help you understand how to answer logical reasoning tests, and will make you familiar with many of the common relationships found in abstract sequences, including orientation, shading, rotations and reflections.

If you’re struggling to identify relevant rules, work backwards from the answer. The better you understand where and how certain rules apply, the more picking them out will become second nature.

As you progress with your practice tests, start taking them under exam conditions, including setting yourself a time limit. Pacing is a key skill in logical reasoning tests, as your score will not only indicate how many correct answers you gave, but how long it took you to answer each question.

Lastly, be sure to practice the right type of test. Ask your prospective employer which of the five types of logical reasoning assessment you’ll be sitting, and if possible, which test provider they use. This will allow you to target your preparation to the specific test format you’ll face on assessment day.

Free example logical reasoning questions

Below you’ll find example questions for the different types of logical reasoning test. Answers to each are given below the set of questions.

For further practice, check out our free logical reasoning test questions and answers.

Deductive reasoning test

All footballers are fit and healthy.

All famous sports players are footballers.

Given that the above is true, which of the following is the logical deduction?

  • All footballers are famous sports people
  • All famous people are fit and healthy
  • All famous sports players are fit and healthy
  • All fit and healthy people are footballers
  • All football players are men

Inductive reasoning test

[Image: inductive reasoning practice question]

How many triangles will be in the 6th shape?

Diagrammatic reasoning test

[Image: diagrammatic reasoning practice questions]

In the grid, one box is missing. You must work out what rules are being applied in the other boxes in order to work out which of boxes A to F will complete the grid.

Abstract reasoning test

[Image: abstract reasoning practice questions]

Which of the boxes comes next in the sequence?

Using deductive reasoning, the only logical answer is 3. To get to this answer, you need to simplify the given facts. All famous sports players are footballers, and all footballers are fit and healthy.

  • We can’t deduce that all footballers are famous sports people, as we haven’t got that information.
  • We can’t deduce that all famous people are fit and healthy, because the fact is about famous sports people.
  • This is the logical answer.
  • This information is not given; all footballers are fit and healthy but we can’t logically link that all fit and healthy people are footballers.
  • This is obviously incorrect, as gender is not mentioned at all in the question.
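
This kind of deduction can be checked mechanically: model each "all X are Y" premise as a directed edge and test whether a conclusion is reachable by chaining edges. The helper function and category labels below are illustrative, not part of any test publisher's material.

```python
# Check whether "all subject are predicate" follows from a list of
# "all A are B" premises by chaining them (simple graph reachability).

def follows(premises, subject, predicate):
    frontier, seen = {subject}, set()
    while frontier:
        current = frontier.pop()
        seen.add(current)
        if current == predicate:
            return True
        # add every category that 'current' is entirely contained in
        frontier |= {b for a, b in premises if a == current} - seen
    return False

premises = [
    ("footballer", "fit and healthy"),
    ("famous sports player", "footballer"),
]

print(follows(premises, "famous sports player", "fit and healthy"))  # True
print(follows(premises, "fit and healthy", "footballer"))            # False
```

Note that the second call is False for exactly the reason given in the answer explanation: "all footballers are fit and healthy" cannot be reversed into "all fit and healthy people are footballers."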

The number of triangles increases by 2 as you move along the sequence. If you continue to add 2 until you reach the 6th shape, you reach 14, so the answer is C).
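
The arithmetic behind this answer is a simple arithmetic progression: with a common difference of 2 and (working back from 14 at the 6th shape) 4 triangles in the first shape, the nth shape has first + (n - 1) × step triangles. The starting value of 4 is inferred from the stated answer, not given explicitly.

```python
# nth term of the triangle sequence as an arithmetic progression:
# first term 4 (inferred from the answer), common difference 2.

def triangles(n, first=4, step=2):
    return first + (n - 1) * step

print(triangles(6))  # 14
```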

In the question the key rule is that the number of ‘star’ shapes in the central column must always equal the number of double circle shapes.

If there are no star shapes there should be no circle shapes. If there are three star shapes, there should be three circle shapes. Option F is the only one that abides by this rule.

Please note: shapes are not in a set position within this sequence. It is merely the presence of the shapes that is important. 1. There are always two squares in the frame. 2. There are always two circles in the frame. 3. There is always one triangle in the frame. So the answer is D).

After using the platform for two weeks, I’ve never felt more prepared for an Aptitude test.

Logical Reasoning Tests FAQs

How are logical reasoning tests scored?

Logical reasoning tests are scored comparatively. That is to say, you’ll receive one mark for each correct answer, and your total score will be compared to the average results of other test-takers. Different employers may assess your results in different ways. Some will look only at your raw score against an average benchmark, while others may also consider your pace.
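
Comparative scoring can be sketched as a percentile calculation: your raw score is reported as the share of other test-takers it beats. The benchmark numbers below are invented purely for illustration.

```python
# Illustrative comparative scoring: report a raw score as the percentage
# of benchmark test-takers who scored lower (benchmark data is invented).

def percentile(raw_score, benchmark_scores):
    beaten = sum(1 for s in benchmark_scores if s < raw_score)
    return 100 * beaten / len(benchmark_scores)

benchmark = [12, 14, 15, 15, 16, 18, 19, 20]
print(percentile(17, benchmark))  # 62.5
```

This is why the same raw score can look strong against one applicant pool and weak against another: the reported result depends on the benchmark group, not just on you.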

What are logical reasoning tests used for?

No matter the type of logical reasoning test used, you’re being assessed on your problem-solving and critical thinking skills. Employers are trying to determine if you have the required ability to interpret information, identify patterns and relationships, and draw solid conclusions. These are skills used on a daily basis in many job roles, so logical reasoning tests are widely used.

How is logical thinking measured?

Logical reasoning tests give a good indication of your lateral thinking skills by measuring your ability to analyse and interpret information to make evidence-based decisions – be they inferences, assumptions or unquestionable conclusions.

Why is logical reasoning important?

Logical reasoning is important in work-based environments because it is this skill set that allows you to work through many everyday business problems and come to the right resolution. Logical thinkers make decisions based on what they know to be true, rather than gut feeling; set achievable goals based on past performance; and approach complex problems in a systematic manner.

Where can I practice logical reasoning tests?

You can find practice tests for all types of logical reasoning assessments on our website, along with detailed answer explanations and guides. You can also find practice tests online from individual publishers which will help you get to grips with specific formats and time constraints.

Which employers use logical reasoning tests?

Logical reasoning tests are commonly used for managerial-level roles and above in many corporate job sectors, including law, investment banking and consultancy, as well as human resources, customer service and market research. It’s also likely you’ll be required to sit some form of logical reasoning test for acceptance onto a graduate scheme with many larger employers.

Logical Reasoning Tests Tips

1 Read each question carefully

It’s vital you understand exactly what is being asked of you, so be sure to read every question thoroughly. There may well be distractors in the multiple-choice options; picking one of these because you’ve misinterpreted the question is a common error.

2 Analyse the stimulus

In deductive or critical reasoning tests, it’s important to fully digest the stimulus before drawing your conclusion. Again, a simple misunderstanding can be the difference between scoring or missing out on a mark, so make sure you’re aware of all the evidence presented to you.

3 Work out your answer before looking at the options

When working with abstract sequences or patterns, try to get an idea in your head of what the missing piece or next part of the sequence is likely to be, before you look at the multiple-choice options. This will help you zone in on the right response, rather than get distracted by irrelevant choices.

4 Make notes

There may be several relationships in any given sequence, and in diagrammatic reasoning tests you’ll need to be aware of multiple processes. Make notes as you go through to keep track of your thought process. It will help you to work methodically and avoid confusion.

5 Pay attention to pacing

You only have a set amount of time to work through all the questions, so be sure to pace yourself. Typically, problems become more complex as the test progresses, so aim to spend less time on questions at the start. Good pacing takes practice. You want to work quickly but not to the detriment of your accuracy.

6 Don't panic

Logical reasoning tests can be a little daunting if you’re not used to them but remember, we apply logic everyday without even realising it. Stay calm and remind yourself that the steps you need to take are familiar to you, it’s just that the problem you’re solving is presented in an unfamiliar way.

Logical Reasoning Video Tutorials

[Video: Mirror Images]

[Video: Rotated Views]

Try logical reasoning tests for free

Logical Reasoning 01

20 Questions | 20 Minutes

Logical Reasoning 02

Logical Reasoning 03

Immediate access. Cancel anytime.

  • 30 Numerical reasoning tests
  • 30 Verbal reasoning tests
  • 30 Diagrammatic reasoning tests
  • 30 Situational judgement tests
  • 34 Publisher packages e.g. Watson Glaser
  • 252 Employer packages e.g. HSBC
  • 29 Extra packages e.g Mechanical
  • Dashboard performance tracking
  • Full solutions and explanations
  • Tips, tricks, guides and resources
  • Access to free tests
  • Basic performance tracking
  • Solutions & explanations
  • Tips and resources

Reviews of our Logical Reasoning tests


South Africa

October 23, 2023

Fun & challenging!

I enjoyed the variety that this test offered. I would have preferred instant, question-by-question feedback over feedback at the end.

TheReal MacBen

Philippines

October 14, 2023

The varying patterns of the figures in each box, and what could be the next chain in that pattern.

I like how the test contained fun and interesting questions that needed logical thinking. However, it is not as complex as one test I answered, so the website should give an option of difficulty in tests.

MARTINE METIEKAM

September 26, 2023

Interesting

I have difficulty identifying the sequence. Honestly, I am not very familiar with the test. Thank you.

Andreas Karlsson

September 15, 2023

I found some of the patterns challenging at first but I do love to solve these little puzzles and recognize the patterns within

United States of America

September 10, 2023

Take one piece at a time

each task was a test to see if you could follow the pattern, some were difficult but it was a nice brain teaser.

September 02, 2023

Quick access to test, without any unnecessary sale propositions

I should not have to create an account to just take a sample test. I am happy to make an account once I take 1 or 2 tests and see whether I want to create an account

Paul Kitchener

United Kingdom

August 29, 2023

Good prep for recruitment test

I liked that I could skip a question and come back to it if I found it difficult under the time limit

Nkosingiphile Nzimande

August 22, 2023

Tricky: Thinking out of the box is key

I like that it is a simple test but if you analyze too much you might get the answers wrong, I kind of felt like I didn’t understand what was going on until the 3rd question.

Daniel Nelson

August 21, 2023

Challenging but fun

I love these tests, not too difficult but hard enough to be able to work through to get your answer,

Talha Iftikhar

August 03, 2023

Good level of test

I like the website and the construction of different questions. The level of free evaluation is quite testing and good.

Article • 8 min read

Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking skills, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. By thinking outside of the box you can identify new possible outcomes by using pieces of information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
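Paired Comparison Analysis lends itself to a short sketch in code. The snippet below is only an illustration (the function name and the stand-in preference rule are ours, not Mind Tools'): every pair of options is compared, the preferred option earns a point, and the totals yield a ranking.

```python
from itertools import combinations
from collections import Counter

def rank_options(options, prefer):
    """Paired Comparison Analysis: compare every pair of options,
    award the preferred one a point, and rank by total score."""
    scores = Counter({option: 0 for option in options})
    for a, b in combinations(options, 2):
        scores[prefer(a, b)] += 1
    return [option for option, _ in scores.most_common()]

# Toy example: "prefer" stands in for human judgment; here it simply
# picks the alphabetically earlier option.
print(rank_options(["C", "A", "B"], min))  # → ['A', 'B', 'C']
```

In practice, `prefer` would capture a real judgment, such as which of two candidate solutions better fits your criteria; the tallying logic stays the same.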

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking, for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.



Course: LSAT > Unit 1

Getting started with logical reasoning

Logical Reasoning overview

  • Two scored sections with 24-26 questions each
  • Logical Reasoning makes up roughly half of your total points.

Anatomy of a Logical Reasoning question

  • Passage/stimulus: This text is where we’ll find the argument or the information that forms the basis for answering the question. Sometimes there will be two arguments, if two people are presented as speakers.
  • Question/task: This text, found beneath the stimulus, poses a question. For example, it may ask what assumption is necessary to the argument, or what must be true based on the statements above.
  • Choices: You’ll be presented with five choices, of which you may select only one. You’ll see us refer to the correct choice as the “answer” throughout Khan Academy’s LSAT practice.

What can I do to tackle the Logical Reasoning section most effectively?

Dos and don’ts

  • Don’t panic: You’re not obligated to do the questions in any order, or even to do a given question at all. Many students find success maximizing their score by skipping a select handful of questions entirely, either because they know a question will take too long to solve, or because they just don’t know how to solve it.
  • Don’t be influenced by your own views, knowledge, or experience about an issue or topic: The LSAT doesn’t require any outside expertise. All of the information that you need will be presented in the passage. When you add your own unwarranted assumptions, you’re moving away from the precision of the test’s language and toward more errors. This is one of the most common mistakes that students make on the LSAT!
  • Don’t time yourself too early on: When learning a new skill, it’s good policy to avoid introducing time considerations until you’re ready. If you were learning piano, you wouldn’t play a piece at full-speed before you’d practiced the passages very slowly, and then less slowly, and then less slowly still. Give yourself time and room to build your skill and confidence. Only when you’re feeling good about the mechanics of your approach should you introduce a stopwatch.
  • Do read with your pencil: Active reading strategies can help you better understand logical reasoning arguments and prevent you from “zoning out” while you read. Active readers like to underline or bracket an argument’s conclusion when they find it. They also like to circle keywords, such as “however”, “therefore”, “likely”, “all”, and many others that you’ll learn throughout your studies with us. If you’re reading with your pencil, you’re much less likely to wonder what you just read in the last minute.
  • Do learn all of the question types: An effective approach to a necessary assumption question is very different from an effective approach to an explain question, even though the passage will look very similar in both. In fact, the same argument passage could theoretically be used to ask you a question about the conclusion, its assumptions or vulnerabilities to criticism, its technique, the role of one of its statements, a principle it displays, or what new info might strengthen or weaken it!
  • Do spend time on the fundamentals: The temptation to churn through a high volume of questions can be strong, but strong LSAT-takers carefully and patiently learn the basics. For example, you’ll need to be able to identify a conclusion quickly and accurately before you’ll be able to progress with assumptions or flaws (identifying gaps in arguments). Similarly, a firm understanding of basic conditional reasoning will be invaluable as you approach many challenging questions. Be patient with yourself!


The development of the reasoning brain and how to foster logical reasoning skills


Executive summary

Learning to reason logically is necessary for the growth of critical and scientific thinking in children. Yet, both psychological and neural evidence indicates that logical reasoning is hard even for educated adults. Here, we examine the factors that scaffold the emergence of logical reasoning in children. Evidence suggests that the development of reasoning with concrete information can be accounted for by the development of both world knowledge and self-regulation. The transition from concrete to abstract reasoning, however, is a challenge for children. Children’s development of reasoning may be supported by encouraging both divergent thinking and reasoning at levels of abstraction that are just above reasoners’ current levels, alongside activities in which children reason with others.

Introduction

It is often argued that one of the most fundamental goals of education is to nurture critical thinking, that is, to teach children to employ good reasoning skills when developing their beliefs. Therefore, fostering logical reasoning should be an important goal for education: Children should learn to provide logical reasons for their opinions and should be able to distinguish between good and bad arguments. This is likely to be important for their effective exercise of citizenship as adults. For example, logical reasoning could tell you that it is unwarranted to conclude “All Muslims are terrorists” from the assertions “All the 9/11 perpetrators are Muslims” and “All the 9/11 perpetrators are terrorists.” Yet, many educated adults still draw such a conclusion, most likely because fear and bias can overcome rational thinking. This suggests that logical reasoning is hard even for educated adults, a conclusion that is supported by a wealth of psychological studies.

Perhaps the most striking demonstration of the difficulty of logical reasoning was discovered by the psychologist Peter Wason in 1966 1 . Wason designed a task in which he presented participants with four playing cards, each with a letter on one side and a number on the other side. For example, the cards could be as follows:

A         B         2          3

Participants were then shown the conditional rule “If a card has the letter A on one side, then it has the number 2 on the other side.” The task consisted of selecting those cards that had to be turned over to discover whether the rule was true or false. Since Wason’s study, the task has been performed many times, and the results are always the same: most people select either the A card alone or both the A and 2 cards. Very few adults, even highly educated ones, choose the 3 card, even though discovering what is on the other side of the 3 card is necessary to evaluate whether the rule is true or false (i.e., if there is an A on the other side of the 3, the rule is false).

This reasoning failure has puzzled psychologists for decades because it questions the long-standing assumption that human beings are inherently rational. Why is it so hard for participants to select the 3 card? Neuroscience research suggests that it is because it is much more difficult for the brain to focus on the elements that are absent from the rule (e.g., 3) than on the elements that are present (e.g., A) 2 . Thus, selecting the 3 card requires much more extensive brain activation in several brain regions (primarily involved in attention and concentration) to overcome that tendency (see Figure 1).

So, how can we get people to activate more of their reasoning brain and act more rationally on this task? One of the first ideas that comes to mind is to teach them logic. Cheng and colleagues 3 tested this: they presented the Wason selection task to college students before and after a whole-semester introductory class in logic (about 40 hours of lectures). Surprisingly, they found no difference in the students’ poor performance between the beginning and the end of the semester. In other words, a whole semester of learning about logic did not help students make any fewer errors on the task!

What, then, can train the reasoning brain? To answer that question, it is interesting to turn to what we know about the development of logical reasoning in children.
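The logic of the selection task can be made concrete in code. This is a minimal sketch of our own (not from the article): a card needs to be turned over only if something that could be on its hidden side would falsify the rule “if A on one side, then 2 on the other.”

```python
LETTERS = ["A", "B"]
NUMBERS = [2, 3]

def rule_holds(letter, number):
    """The rule: any card with the letter A must carry the number 2."""
    return letter != "A" or number == 2

def must_turn(visible):
    """A card matters only if some possible hidden face breaks the rule."""
    if visible in LETTERS:
        return any(not rule_holds(visible, n) for n in NUMBERS)
    return any(not rule_holds(l, visible) for l in LETTERS)

cards = ["A", "B", 2, 3]
print([card for card in cards if must_turn(card)])  # → ['A', 3]
```

The brute force confirms the normative answer: the A card (a hidden 3 would break the rule) and the 3 card (a hidden A would break it). The 2 card is irrelevant, because no hidden letter behind it can falsify the rule.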

Figure 1. The reasoning brain. Location of the brain regions (in red, blue, and white) that are activated when participants reason with elements that are not present in the rule in the Wason card task. Activations are displayed on pictures of the brain taken using a magnetic resonance imaging scanner. (Reproduced from Ref. 2 )

The development of concrete logical reasoning in children

It is clear that even young children can use some logical reasoning when concrete information is involved. For instance, most 6-year-olds can draw the conclusion “The person is hurt” from the statements “If the person breaks his arm, the person will be hurt” and “The person breaks his arm.” However, the reasoning abilities of young children are limited. For example, many 6-year-olds would also draw the conclusion “The person broke his arm” from the statements “If the person breaks his arm, the person will be hurt” and “The person is hurt.” This, however, is an invalid conclusion because there may be many other reasons why a person could be hurt. Children will progressively understand this and will make this type of reasoning error less and less as they get older. By the time they reach the end of elementary school, most children are able to refrain from concluding “The person broke his arm” from the statements “If the person breaks his arm, the person will be hurt” and “The person is hurt” 4 . Critically, this increased reasoning ability is mirrored by an increase in the ability to think about alternate causes for a given consequence. For example, older children are much more able than younger children to think about the many other reasons why someone would be hurt, like getting sick, breaking a leg, cutting a finger, etc. In other words, better reasoning ability with age is associated with a better ability to consider alternatives from stored knowledge. Clearly, however, children differ in terms of what they know about the world. This predicts that those who have better world knowledge and can think about more alternatives should be better reasoners than the others. And this is exactly what has been shown in several studies 4 .
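The difference between the valid and the invalid inference pattern above can be checked mechanically by enumerating truth values, as in this small Python sketch of our own (not from the article): modus ponens forces the conclusion in every world consistent with its premises, while “affirming the consequent” does not.

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if P then Q' is false only when P is true and Q is false."""
    return (not p) or q

worlds = list(product([True, False], repeat=2))  # all (P, Q) truth assignments

# Modus ponens: in every world where "P -> Q" and "P" hold, Q holds too.
mp_worlds = [(p, q) for p, q in worlds if implies(p, q) and p]
print(all(q for p, q in mp_worlds))   # → True (valid inference)

# Affirming the consequent: "P -> Q" and "Q" do not force P,
# because the world (P=False, Q=True) is also consistent with the premises.
ac_worlds = [(p, q) for p, q in worlds if implies(p, q) and q]
print(all(p for p, q in ac_worlds))   # → False (invalid inference)
```

Here P is “the person breaks his arm” and Q is “the person is hurt”: being hurt (Q true) leaves open other causes, which is exactly the counterexample world (P=False, Q=True) that older children learn to consider.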

Interestingly, the importance of world knowledge for reasoning has a paradoxical effect: It can make children poorer reasoners on some occasions. For example, children who can think about a lot of alternatives would be less inclined to draw the logically valid conclusion “The person will be tired” from the statements “If a person goes to sleep late, then he will be tired” and “The person goes to sleep late.” This is because a child with significant world knowledge can think of several circumstances that would make the conclusion unwarranted, such as waking up later the next day. Thus, more world knowledge needs to be associated with more ability to suppress the alternatives that might come to mind if the task requires it. This self-regulation ability relies on a part of the brain that also massively develops during childhood, i.e., the prefrontal cortex (see Figure 2). Overall, then, the development of concrete logical reasoning in children can be largely accounted for by the development of both world knowledge and self-regulation skills that are associated with the frontal cortex.

Figure 2. The prefrontal cortex. Location of the prefrontal cortex on a 3D rendering of the human brain. Polygon data were generated by Database Center for Life Science (DBCLS), distributed under a CC-BY-SA-2.1-jp license.

From concrete to abstract reasoning

There is, however, an important difference between the reasoning skills described above and the task developed by Peter Wason about the four cards. What we just described relates to reasoning with very concrete information, whereas the card task involves reasoning with purely abstract information. Abstract reasoning is difficult because it requires one to manipulate information without any referent in the real world. Knowledge is of no help. In fact, neuroscience research indicates that abstract and concrete reasoning rely on two different parts of the brain 5 (see Figure 3). The ability to reason logically with an abstract premise is generally only found during late adolescence 4 . Transitioning from concrete to abstract reasoning may require extensive practice with concrete reasoning. With mastery, children may extract from the reasoning process abstract strategies that could be applied to abstract information. A recent study, however, suggests a trick to help facilitate this transition in children 6 . The researchers discovered that abstract reasoning in 12- to 15-year-olds is much improved when these adolescents are previously engaged in a task in which they have to reason with information that is concrete but empirically false, such as “If a shirt is rubbed with mud, then the shirt will be clean.” No such effect was observed when adolescents are asked to reason with concrete information that is empirically true, such as “If a shirt is washed with detergent, then the shirt will be clean.” Therefore, reasoning with information that contradicts what we know about the world might constitute an intermediary step in transitioning from concrete to abstract reasoning.

Figure 3. Brain regions activated when reasoning with concrete (left) and abstract (right) information. Activations are displayed on pictures of the brain taken using a magnetic resonance imaging scanner. (Reproduced from Ref. 5 )

What can we do to foster logical reasoning skills?

What, then, can we do to help foster the development of logical reasoning skills in children? The research described above suggests several potentially fruitful ways.

First, it is clear that the development of concrete reasoning—the very first type of reasoning children can engage in—relies on an increased ability to think about counter-examples for a given statement. This implies that knowledge about the world is critical to the emergence of logical reasoning in children, at least when concrete information is involved. Therefore, all activities that would expand such world knowledge (e.g., reading informational books, learning new vocabulary, exploring new environments and places) are likely to be beneficial to the development of children’s reasoning skills.

Second, it is important to consider that the more world knowledge a child possesses, the more he/she will need to juggle with this knowledge. For example, generating counter-examples when solving a reasoning problem will require maintaining pieces of information in memory for a short period of time, a type of memory called working memory. World knowledge can also sometimes be detrimental to reasoning and needs to be inhibited, such as when recognizing that the conclusion “The person will be tired” logically follows from the statements “If a person goes to sleep late, then he will be tired” and “The person goes to sleep late” (even if one might think of several conditions that would make the conclusion untrue based on what we know about the world). Fostering these types of self-regulation skills (working memory and inhibition) should thus be beneficial to the development of logical reasoning. Several studies suggest that these functions could be promoted by targeting children’s emotional and social development, such as in curricula involving social pretend play (requiring children to act out of character and adjust to the improvisation of others), self-discipline, orderliness, and meditation exercises 7 . Studies also indicate positive effects of various physical activities emphasizing self-control and mindfulness, such as yoga or traditional martial arts 7 .

Third, studies indicate that the transition from concrete to abstract reasoning occurring around adolescence is challenging. Although more research is needed in this domain, one promising way to help this transition is by encouraging children’s thinking about alternatives with content that contradicts what they know about the world (e.g., “If a shirt is rubbed with mud, then the shirt will be clean”). In sum, as stated by Henry Markovits, “the best way to encourage the development of more abstract ways of logical reasoning is to gradually encourage both divergent thinking and reasoning at levels of abstraction that are just above reasoners’ current levels” 4 .

Fostering the development of logical reasoning should be an important goal of education. Yet, studies indicate that logical reasoning is hard even for educated adults and relies on the activation of an extensive network of brain regions. Neuroscience studies also demonstrate that reasoning with concrete information involves brain regions that qualitatively differ from those involved in reasoning with more abstract information, explaining why transitioning from concrete to abstract reasoning is challenging for children. We nonetheless reviewed here the more recent research on the development of reasoning skills and suggest several important factors that scaffold children’s reasoning abilities, such as world knowledge and self-regulation functions. On a final note, it is important to consider that logical reasoning is not something that we always do on our own, isolated from our peers. In fact, some have argued that the very function of reasoning is to argue with our peers (i.e., to find the best arguments to convince others and to evaluate arguments made by others) 8 . This idea is interesting from an educational point of view because it suggests that reasoning with others might be easier than reasoning in isolation—a hypothesis validated by several studies. For example, performance on the card task developed by Peter Wason is much higher when participants solve it as a group rather than alone 8 . Therefore, encouraging activities in which children reason with others might also be a fruitful avenue for stimulating the reasoning brain.

  • Wason, P. C. Reasoning. In New Horizons in Psychology (ed. Foss, B. M.). (Penguin: Harmondsworth, 1966).
  • Prado, J., & Noveck, I. A. Overcoming perceptual features in logical reasoning: A parametric functional magnetic resonance imaging study. J Cogn Neurosci 19(4): 642-657 (2007).
  • Cheng, P. W. et al. Pragmatic versus syntactic approaches to training deductive reasoning. Cogn Psychol 18(3): 293-328 (1986).
  • Markovits, H. How to develop a logical reasoner. In The Developmental Psychology of Reasoning and Decision-Making (ed. Markovits, H.) 148-164. (Psychology Press: Hove, UK, 2014).
  • Goel, V. Anatomy of deductive reasoning. Trends Cogn Sci 11(10): 435-441 (2007).
  • Markovits, H., & Lortie-Forgues, H. Conditional reasoning with false premises facilitates the transition between familiar and abstract reasoning. Child Development 82(2): 646-660 (2011).
  • Diamond, A., & Lee, K. Interventions shown to aid executive function development in children 4 to 12 years old. Science 333(6045): 959-964 (2011).
  • Mercier, H., & Sperber, D. Why do humans reason? Arguments for an argumentative theory. Behav Brain Sci 34(2): 57-74; discussion 74-111 (2011).


What Is Logical Thinking – Significance, Components, And Examples


Logical thinking skills play a significant role in developing your career because they help you reason through vital decisions, generate creative ideas, set goals, and solve problems. You will encounter many challenges when you enter the workforce or advance your career, so you need strong logical reasoning skills to solve the problems they present.

But you must know ‘what is logical thinking’ before you move forward or come up with solutions.

What Is Logical Thinking?

Logical thinking is the ability to think in a disciplined manner and to base significant conclusions on evidence and facts. It involves applying logic when analyzing a problem in order to devise a solution. Because logical thinking follows a structured, progressive system of analysis, Soft Skills Courses can help you develop it.

Now that you know what logical thinking means, you can undertake Knowledgehut Training to make your thoughts reasoned, probable, and actionable. Many fields, such as project management, benefit from logical thinking skills, so consider an accredited PMP certification program as well.

Importance Of Logical Thinking

According to a global report, problem-solving, an aspect of critical and logical thinking, is one of the top skills employers look for in job candidates. This explains the demand for logical thinking and reasoning abilities.

Having covered the meaning of logical reasoning, it is time to understand its importance through the following points.

1. It Encourages Independent Abilities

You may need multiple demonstrations and examples to learn and comprehend a process, but prolonged, frequent demonstration only goes so far, because problem-solving ultimately requires your own reasoning and analysis. Logical thinking is defined by exactly these independent reasoning abilities.

2. It Promotes Creativity and Innovation

Thinking outside the box helps you devise creative solutions to your problems. This is where logical thinking comes in handy: it allows you to develop better ideas and make sense of the events happening in your life.

3. It Helps Enhance Analytical Thinking

You weigh all possible results and evaluate different options to ensure a favorable outcome for your decisions. Logical reasoning helps you approach multiple-choice questions, and problems in general, in a structured way, thinking through each option to reach the desired answer.

4. It Helps Strengthen the Brain

Logical reasoning involves diverse tasks that activate various parts of your brain: memory, visual-shape memory, verbal-logic memory, and so on. The process strengthens your brain and helps you distinguish the significant facets of life.

5. It Helps Enhance Focus

Logical thinking is one of the best ways to increase your concentration. Reasoning-ability tests demand focus on problem-solving and use multiple methods and strategies that keep you engaged and build positive self-esteem.

Ways To Improve Your Logical Thinking

Understanding the definition of logical thinking makes clear that it is a skill you must possess to move forward in life, and one you can improve and develop through the right activities and exercises. Here is a breakdown of tips to help improve your logical thinking abilities.

  • Learn from your life’s mistakes.  
  • Anticipate what lies ahead of you and other future happenings.  
  • Take complex mental tests.  
  • Stimulate your brain through multiple activities.  
  • Differentiate between observation and inferences.  
  • Try to recognize repetitive patterns like a sequence of numbers.  
  • Indulge in analytical values like critical thinking, interpreting, deciding, and concluding facts.
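The pattern-recognition tip above can be made concrete: spotting a constant step in a number sequence is a small, checkable exercise. A minimal sketch in Python (the function name is our own invention, not from this article):

```python
def constant_step(seq):
    """Return the common difference if seq is an arithmetic progression, else None."""
    if len(seq) < 2:
        return None
    step = seq[1] - seq[0]
    # Every consecutive pair must share the same difference.
    return step if all(b - a == step for a, b in zip(seq, seq[1:])) else None

print(constant_step([2, 5, 8, 11]))  # → 3
print(constant_step([1, 2, 4, 8]))   # → None (doubling, not a constant step)
```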

Logical Thinking Skills

Logical reasoning skills are best defined as the ability to focus on a task and work through it by following a chain of thought, relating statements to one another until you reach a logical solution to your problem.

How To Build Logical Thinking Skills?

Work on your logical thinking development to enhance your problem-solving abilities. Here is a breakdown of the techniques to help you overcome your thinking obstacles and understand what the concept of logical thinking is.

  • Do not view things only from your own perspective; try to understand other people’s opinions.  
  • Think before you start doing things by devising efficient strategies.  
  • Analyze the meaning of words and sentences carefully.  
  • Enhance your thinking skills through games and mystery books.

How To Think Logically in Five Steps?

Logical reasoning means rationalizing your thoughts to create positive outcomes. The process combines situational awareness with the ability to regulate emotions, enabling efficient decision-making. Here is how you can think logically before making decisions.

1. Take Part in Creative Activities

Creative activities like painting, writing, drawing, music, etc., help stimulate your brain and promote logical thinking. Creative thinking also helps develop problem-solving abilities to make you a better performer.

2. Practice Asking Meaningful Questions

Try asking questions regularly to gain a comprehensive perspective of the facts. It will enable you to approach problems creatively and logically and devise solutions strategically.

3. Spend Time with Other People

Try developing meaningful relationships with other people to help broaden your views and perspectives. Socializing with them will help you think logically and provide alternative viewpoints to solutions.

4. Learn New Skills

You must learn new skills frequently to sharpen your logical reasoning abilities. Take opportunities to learn as often as possible and practice your skills daily to help thoughtfully approach situations.

5. Visualize the Outcome of Your Decisions

You must consider your decisions and their impact on your future to help assess positive outcomes. Visualizing the outcome of your choices and decisions will help you strengthen your logical thinking skills.

Components Of Logical Thinking

When someone asks what logical thinking means, part of your answer should concern emotional reasoning and emotional intelligence: self-awareness of your feelings and the ability to prevent them from affecting your decision-making process.


With the concept understood, you must know four significant components of logical thinking.

1. Deductive Reasoning

Deductive reasoning, or deduction, is a significant component of logical thinking that moves from general premises to specific conclusions. The process simplifies understanding and supports rational, logical thought.

2. Inductive Reasoning

Inductive reasoning, or induction, works in the opposite direction: it builds generalizations from anecdotal experiences, facts, and personal observations, which may turn out to be true or false.

3. Causal Inference

Causal inference involves recognizing cause-and-effect relationships: how one event brings about a change in another. Recognizing these relationships lets you reason about why things happen and take specific actions accordingly.

4. Analogical Reasoning

Analogical reasoning, or analogy, enables you to find similarities between two different situations. Analogy helps you understand a new situation in terms of a familiar one, which supports logical thinking and rational decision-making.

Examples Of Thinking Logically on Different Occasions

What is a logical thinking example? If you are asking yourself this question, look at the following situations for reference.

1. Logical Thinking When You Are in Disagreement

You and your friend discuss the upcoming cricket match and disagree about who will be the opening batsman. You reason through the facts logically and, finding your friend’s prediction better supported, concede the point.

2. Logical Thinking to Complete Your Work

You had planned a day out with friends for the weekend, but you got caught up with some pending work. The logical way to sort the situation would be to complete your work beforehand and head out for your getaway.

3. Logical Thinking When Making a Tough Decision

You get a good job opportunity in another city, but it makes you emotional thinking you have to leave your hometown. The logical way is to think of the opportunities awaiting you in the other place and decide to take the job.   

4. Logical Thinking When You Do Not Know the Answer

If you do not know the answer to a few questions about your recent assignment, the logical way of solving them is by approaching your teacher and asking for clarification.   

5. Other Logical Thinking Examples

Logical thinking involves reasoning skills to study problems and find rational conclusions or solutions. One of the best examples is the following situation.

You are facing some problems in the office. So, you use the available facts using your logical reasoning skills to address them.

Here is another example of logical reasoning.

You develop a fever ahead of an important meeting that you cannot miss at any cost. The logical way to solve the problem is to attend the meeting virtually instead of remaining physically present.

In Conclusion

Logical thinking is the act of analyzing situations and using reasoning abilities to study a problem and reach a rational conclusion. When you become a logical thinker, you gather all the information you can, assess the facts, and methodically decide the best way to move forward. Most people consider logical thinking an essential tool for brainstorming ideas, analyzing problems, and finding answers at home, in the workplace, or in educational institutions.

Frequently Asked Questions (FAQs)

You can consider yourself a logical thinker if you are attentive, get your facts straight, and have clear ideas about situations.

Yes, logical thinking is a soft skill, one that is easy to practice and that improves your reasoning abilities.

Economists, software developers, accountants, chemical engineers, technical writers, criminologists, and other related careers use logical thinking.

Logical thinkers are good at observing and analyzing situations, feedback, and reactions to draw rational conclusions.

Profile

Mounika Narang

Mounika Narang is a project manager specialising in IT project management and Instructional Design. She has 10 years of experience working with Fortune 500 companies to solve their most important development challenges. She lives in Bangalore with her family.



Open Access

Peer-reviewed

Research Article

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Writing – original draft, Writing – review & editing

Affiliation School of Mathematics and Statistics, The University of Sydney, Sydney, Australia

Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft, Writing – review & editing

* E-mail: [email protected]

Affiliation School of Arts and Humanities, Edith Cowan University, Joondalup, Australia


  • Clio Cresswell, 
  • Craig P. Speelman


  • Published: July 29, 2020
  • https://doi.org/10.1371/journal.pone.0236153


Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically or to have improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational and logical reasoning tasks aggregated from various psychological studies. These tasks, that included the Cognitive Reflection Test and the Wason Selection Task, are of particular interest as they have typically and reliably eluded participants in all studies, and results have been uncorrelated with general intelligence, education levels and other demographic information. The results in this study revealed that in general the greater the mathematics training of the participant, the more tasks were completed correctly, and that performance on some tasks was also associated with performance on others not traditionally associated. A ceiling effect also emerged. The work is deconstructed from the viewpoint of adding to the platform from which to approach the greater, and more scientifically elusive, question: are any skills associated with mathematics training innate or do they arise from skills transfer?

Citation: Cresswell C, Speelman CP (2020) Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors. PLoS ONE 15(7): e0236153. https://doi.org/10.1371/journal.pone.0236153

Editor: Jérôme Prado, French National Center for Scientific Research (CNRS) & University of Lyon, FRANCE

Received: January 13, 2020; Accepted: June 30, 2020; Published: July 29, 2020

Copyright: © 2020 Cresswell, Speelman. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the paper and its Supporting Information files.

Funding: The authors received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Mathematics is often promoted as endowing those who study it with a number of broad thinking skills, such as an ability to think logically, analytically, critically and abstractly, and a capacity to weigh evidence with impartiality. This view of mathematics as providing transferable skills can be found across educational institutions, governments and corporations worldwide, and it is material to the place of mathematics in curricula.

Consider the UK government’s commissioned inquiry into mathematics education “Making Mathematics Count” ascertaining the justification that “mathematical training disciplines the mind, develops logical and critical reasoning, and develops analytical and problem-solving skills to a high degree” [ 1 p11]. The Australian Mathematical Sciences Institute very broadly states in its policy document “Vision for a Maths Nation” that “Not only is mathematics the enabling discipline, it has a vital productive role planning and protecting our well-being” (emphasis in original) [ 2 ]. In Canada, British Columbia’s New 2016 curriculum K-9 expressly mentions as part of its “Goals and Rationale”: “The Mathematics program of study is designed to develop deep mathematical understanding and fluency, logical reasoning, analytical thought, and creative thinking.” [ 3 ]. Universities, too, often make such specific claims with respect to their teaching programs. “Mathematics and statistics will help you to think logically and clearly, and apply a range of problem-solving strategies” is claimed by The School of Mathematical Sciences at Monash University, Australia [ 4 ]. The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes as part of particular course objectives and outcomes skills that include “enhance your problem-solving skills” as part of studies in first year [ 5 ], “develop logical thinking” as part of studies in second year, which was a statement drafted by the lead author in fact [ 6 ], and “be fluent in analysing and constructing logical arguments” as part of studies in third year [ 7 ]. The University of Cambridge’s Faculty of Mathematics, UK, provides a dedicated document “Transferable Skills in the Mathematical Tripos” as part of its undergraduate mathematics course information, which again lists “analytic ability; creativity; initiative; logical and methodical reasoning; persistence” [ 8 ].

In contrast, psychological research, which has been empirically investigating the concept of transferability of skills since the early 1900s, points quite oppositely to reasoning skills as being highly domain specific [ 9 ]. Therefore, support for claims that studying mathematics engenders more than specific mathematics knowledge is highly pertinent. And yet it is largely absent. The 2014 Centre for Curriculum Redesign (CCR) four part paper “Mathematics for the 21st Century: What Should Students Learn?” concludes in its fourth paper titled “Does mathematics education enhance higher-order thinking skills?” with a call to action “… there is not sufficient evidence to conclude that mathematics enhances higher order cognitive functions. The CCR calls for a much stronger cognitive psychology and neuroscience research base to be developed on the effects of studying mathematics” [ 10 ].

Inglis and Simpson [ 11 ], bringing up this very issue, examined the ability of first-year undergraduate students from a high-ranking UK university mathematics department, on the “Four Cards Problem” thinking task, also known as the Wason Selection Task. It is stated as follows.

Each of the following cards has a letter on one side and a number on the other.

[Figure: the four cards shown to participants]

Here is a rule: “if a card has a D on one side, then it has a 3 on the other”. Your task is to select all those cards, but only those cards, which you would have to turn over in order to find out whether the rule is true or false. Which cards would you select?

This task involves understanding conditional inference, namely understanding the rule “If P then Q” and with this, deducing the answer as “P and not Q” or “D and 7”. Such logical deduction indeed presents as a good candidate to test for a potential ability of the mathematically trained. This task has also been substantially investigated in the domain of the psychology of reasoning [ 12 p8] revealing across a wide range of publications that only around 10% of the general population reach the correct result. The predominant mistake being to pick “D and 3”; where in the original study by Wason [ 13 ] it is suggested that this was picked by 65% of people. This poor success rate along with a standard mistake has fuelled interest in the task as well as attempts to understand why it occurs. A prevailing theory being the so named matching bias effect; the effect of disproportionately concentrating on items specifically mentioned in the situation, as opposed to reasoning according to logical rules.
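The deduction “P and not-Q” can be checked mechanically. The sketch below assumes the standard Wason card faces D, K, 3 and 7 (the figure is missing above, so the exact faces are our assumption); a card must be turned only if its visible face could conceal a counterexample to “if D then 3”:

```python
def must_turn(face):
    """A visible face forces a turn if the hidden side could violate 'if D then 3'."""
    if face == "D":          # hidden side might not be 3 (checks P)
        return True
    if face.isdigit():       # number card: hidden side might be a D
        return face != "3"   # a visible 3 can never falsify the rule (checks not-Q)
    return False             # other letters are irrelevant to the rule

cards = ["D", "K", "3", "7"]               # assumed standard card set
print([c for c in cards if must_turn(c)])  # → ['D', '7'], the "P and not Q" answer
```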

Inglis and Simpson’s results isolated mathematically trained individuals with respect to this task. The participants were under time constraint, and 13% of the first-year undergraduate mathematics students sampled reached the correct response, compared to 4% of the non-mathematics (arts) students who were included. Of note also was the 24% of mathematics students, as opposed to 45% of the non-mathematics students, who chose the standard mistake. The study indeed unveiled that mathematically trained individuals were significantly less affected by the matching bias effect with this problem than the individuals without mathematics training. However, the achievement of the mathematically trained group was still far from masterful, and the preponderance of a non-standard mistake compared with non-mathematically trained people is suggestive. Mathematical training appears to engender a different thinking style, but it remains unclear what the difference is.

Inglis, Simpson and colleagues proceeded to follow up their results with a number of studies concentrated on conditional inference in general [ 14 , 15 ]. A justification for this single investigatory pathway being that if transfer of knowledge is present, something subtle to test for in the first place, a key consideration should be the generalisation of learning rather than the application of skills learned in one context to another (where experimenter bias in the choice of contexts is more likely to be an issue). For this they typically used sixteen “if P then Q” comprehension tasks, where their samples across a number of studies have included 16-year-old pre-university mathematics students (from England and Cyprus), mathematics honours students in their first year of undergraduate university study, third year university mathematics students, and associated control groups. The studies have encompassed controls for general intelligence and thinking disposition prior to training, as well as follow-ups of up to two years to address the issue of causation. The conclusive thinking pattern that has emerged is a tendency of the mathematical groups towards a greater likelihood of rejecting the invalid denial of the antecedent and affirmation of the consequent inferences. But with this, and this was validated by a second separate study, the English mathematics group actually became less likely to endorse the valid modus tollens inference. So again, mathematical training appears to engender a different thinking style, but there are subtleties and it remains unclear what the exact difference is.

This project was designed to broaden the search on the notion that mathematics training leads to increased reasoning skills. We focused on a range of reasoning problems considered in psychological research to be particularly insightful into decision making, critical thinking and logical deduction, with their distinction in that the general population generally struggles with answering them correctly. An Australian sample adds diversity to the current enquiries that have been European focussed. Furthermore, in an effort to identify the impact of mathematics training through a possible gradation effect, different levels of mathematically trained individuals were tested for performance.

Well-studied thinking tasks from a variety of psychological studies were chosen. Their descriptions, associated success rates and other pertinent details follow. They were all chosen because participants typically miss the correct answer in favour of a standard mistake.

The three-item Cognitive Reflection Test (CRT) was used as introduced by Frederick [ 16 ]. This test was devised in line with the theory that there are two general types of cognitive activity: one that operates quickly and without reflection, and another that requires not only conscious thought and effort, but also an ability to reflect on one’s own cognition by including a step of suppression of the first to reach it. The three items in the test involve an incorrect “gut” response and further cognitive skill is deemed required to reach the correct answer (although see [ 17 ] for evidence that correct responses can result from “intuition”, which could be related to intelligence [ 18 ]).

Lily pads

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Widgets

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Bat and ball

A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?

The solutions are: 47 days for the Lily Pads problem, 5 minutes for the Widgets problem and 5 cents for the Bat and Ball problem. The considered intuitive, but wrong, answers are 24 days, 100 minutes and 10 cents, respectively. These wrong answers are attributed to participants becoming over focused on the numbers so as to ignore the exponential growth pattern in the Lily Pads problem, merely complete a pattern in numbers in the Widgets problem, and neglect the relationship “more than” in the Bat and Ball problem [ 19 ]. The original study by Frederick [ 16 ] provides a composite measure of the performance on these three items, with only 17% of those studied (n = 3428) reaching the perfect score. The CRT has since been studied extensively [ 19 – 21 ]. Research using the CRT tends not to report performance on the individual items of the test, but rather a composite measure of performance. Attridge and Inglis [ 22 ] used the CRT as a test for thinking disposition of mathematics students as one way to attempt to disentangle the issue of filtering according to prior thinking styles rather than transference of knowledge in successful problem solving. They repeat-tested 16-year-old pre-university mathematics students and English literature students without mathematics subjects at a one-year interval and found no difference between groups.
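The three CRT answers follow from short arithmetic, sketched below (the unit choices are ours; working in cents avoids floating-point issues in the Bat and Ball problem):

```python
# Lily Pads: the patch doubles daily, so it is half-covered exactly one day before full.
full_day = 48
half_day = full_day - 1                                  # 47 days

# Widgets: 5 machines / 5 minutes / 5 widgets means each machine makes 1 widget per 5 minutes,
# so 100 machines make 100 widgets in the same 5 minutes.
minutes_per_widget_per_machine = 5
time_100 = minutes_per_widget_per_machine * 100 // 100   # 5 minutes

# Bat and Ball, in cents: ball + (ball + 100) = 110, hence ball = 5 cents.
total, difference = 110, 100
ball = (total - difference) // 2

print(half_day, time_100, ball)  # → 47 5 5
```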

Three problems were included that test the ability to reason about probability. All three problems were originally discussed by Kahneman and Tversky [ 23 ], with the typically poor performance on these problems explained by participants relying not on probability knowledge, but a short-cut method of thinking known as the representativeness heuristic. In the late 1980s, Richard Nisbett and colleagues showed that graduate level training in statistics, while not revealing any improvement in logical reasoning, did correlate with higher-quality statistical answers [ 24 ]. Their studies led in particular to the conclusion that comprehension of what is known as the law of large numbers did show improvement with training. The first of our next three problems targeted this law directly.

  • (a). the larger hospital
  • (b). the smaller hospital
  • (c). about the same (that is, within 5 percent of each other)

Kahneman and Tversky [ 23 ] reported that, of 50 participants, 12 chose (a), 10 chose (b), and 28 chose (c). The correct answer is (b), for the reason that small samples are more likely to exhibit extreme events than large samples from the same population. The larger the sample, the more likely it will exhibit characteristics of the parent population, such as the proportion of boys to girls. However, people tend to discount or be unaware of this feature of sampling statistics, which Kahneman and Tversky refer to as the law of large numbers. Instead, according to Kahneman and Tversky, people tend to adhere to a fallacious law of small numbers, where even small samples are expected to exhibit properties of the parent population, as illustrated by the proportion of participants choosing the answer (c) in their 1972 study. Such thinking reflects use of the representativeness heuristic, whereby someone will judge the likelihood of an uncertain event based on how similar it is to characteristics of the parent population of events.
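The sampling effect described here is easy to simulate. The sketch below assumes the classic parameters of the hospital problem (the full statement is elided above): roughly 45 births per day in the larger hospital, 15 in the smaller, counting days on which more than 60% of births are boys:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def extreme_days(births_per_day, n_days=20_000, threshold=0.6):
    """Count simulated days on which the share of boys exceeds the threshold."""
    count = 0
    for _ in range(n_days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > threshold:
            count += 1
    return count

small, large = extreme_days(15), extreme_days(45)
# Smaller samples fluctuate more, so the small hospital records more extreme days.
print(small > large)  # → True
```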

Birth order

  • (a). What is your estimate of the number of families surveyed in which the exact order of births was BGBBBB?
  • (b). In the same survey set, which, if any, of the following two sequences would be more likely: BBBGGG or GBBGBG?

All of the events listed in the problem have an equal probability, so the correct answer to (a) is 72, and to (b) is “neither is more likely”. Kahneman and Tversky [ 23 ] reported that 75 of 92 participants judged the sequence in (a) as less likely than the given sequence. A similar number (unspecified by Kahneman and Tversky, but the statistical effect was reported to be of the same order as in (a)) reported that GBBGBG was the more likely sequence. Again, Kahneman and Tversky suggested that these results reflected use of the representativeness heuristic. In the context of this problem, the heuristic would have taken the following form: some birth orders appear less patterned than others, and less patterned is to be associated with the randomness of birth order, making them more likely.
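The equal likelihood of exact birth orders can be verified by brute-force enumeration under the uniform coin-flip model (the survey preamble is elided above; the sequences are those named in the text):

```python
from itertools import product

# All 2**6 = 64 equally likely birth orders of six children.
orders = ["".join(p) for p in product("BG", repeat=6)]

def prob(sequence):
    """Probability of one exact order under the uniform model."""
    return orders.count(sequence) / len(orders)

# Every exact sequence, patterned-looking or not, has probability 1/64.
print(prob("BGBBBB"), prob("BBBGGG"), prob("GBBGBG"))
```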

Coin tosses

  • (a). H T H T H T H T
  • (b). H H H H T T T T
  • (c). T T H H T T H H
  • (d). H T T H T H H T
  • (e). all of the above are equally likely

The correct answer in this problem is (e). Kahneman and Tversky [ 23 ] reported that participants tend to choose less patterned looking sequences (e.g., H T T H T H H T) as more likely than more systematic looking sequences (e.g., H T H T H T H T). This reasoning again reflects the representativeness heuristic.
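The same enumeration argument applies to the coin sequences: among the 2**8 = 256 equally likely outcomes of eight fair tosses, each listed sequence occurs exactly once, so none is more probable than another. A short check:

```python
from itertools import product

# All 256 equally likely sequences of eight fair coin tosses.
outcomes = ["".join(p) for p in product("HT", repeat=8)]

listed = ["HTHTHTHT", "HHHHTTTT", "TTHHTTHH", "HTTHTHHT"]
counts = [outcomes.count(s) for s in listed]
print(counts, len(outcomes))  # → [1, 1, 1, 1] 256
```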

Three further questions from the literature were included to test problem solving skill.

Two drivers

  • (a). Driver A would win the race
  • (b). Driver B would win the race
  • (c). the two drivers would arrive at the same time (within a few seconds of one another)

This problem was developed by Pelham and Neter [ 25 ]. The correct answer is (a), which can be determined by calculations of driving times for each Driver, using time = distance/velocity. Pelham and Neter argue, however, that (c) is intuitively appealing, on the basis that both drivers appear to have the same overall average speed. Pelham and Neter reported that 67% of their sample gave this incorrect response to the problem, and a further 13% selected (b).
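The full problem statement is elided above, but the computation the authors point to (time = distance/velocity) can be illustrated with hypothetical speeds of our own choosing, not those of Pelham and Neter: covering two equal distances at two different speeds takes longer than driving the whole way at their arithmetic-mean speed, because the effective average over equal distances is the harmonic mean. This is why the "same average speed" intuition misleads.

```python
# Hypothetical numbers for illustration only.
distance = 120.0        # total km, split into two equal halves
v1, v2 = 40.0, 80.0     # km/h over each half

time_split = (distance / 2) / v1 + (distance / 2) / v2  # 1.5 h + 0.75 h
time_mean = distance / ((v1 + v2) / 2)                  # whole trip at the mean speed

print(time_split, time_mean)  # → 2.25 2.0 (the equal-distance split is slower)
```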

Petrol station

Imagine that you are driving along the road and you notice that your car is running low on petrol. You see two petrol stations next to each other, both advertising their petrol prices. Station A’s price is 65c/litre; Station B’s price is 60c/litre. Station A’s sign also announces: “5c/litre discount for cash!” Station B’s sign announces “5c/litre surcharge for credit cards.” All other factors being equal (for example, cleanliness of the stations, number of cars waiting at each etc), to which station would you choose to go, and why?

This problem was adapted from one described by Galotti [ 26 ], and is inspired by research reported by Thaler [ 27 ]. According to Thaler’s research, most people prefer Station A, even though both stations are offering the same deal: 60c/litre for cash, and 65c/litre for credit. Tversky and Kahneman [ 28 ] explain this preference by invoking the concept of framing effects. In the context of this problem, such an effect would involve viewing the outcomes as changes from some initial point. The initial point frames the problem, and provides a context for viewing the outcome. Thus, depending on the starting point, outcomes in this problem can be viewed as either a gain (in Station A, you gain a discount if you use cash) or a loss (in Station B, you are charged more (a loss) for using credit). Given that people are apparently more concerned about a loss than a gain [ 29 ], the loss associated with Station B makes it the less attractive option, and hence the preference for Station A. The correct answer, though, is that the stations are offering the same deal and so no station should be preferred.
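The equivalence of the two stations is a two-line calculation over the prices given in the problem:

```python
# Effective price per litre (in cents) at each station, by payment method.
station_a = {"cash": 65 - 5, "credit": 65}   # advertised 65c, 5c discount for cash
station_b = {"cash": 60, "credit": 60 + 5}   # advertised 60c, 5c surcharge for credit

# Both stations charge 60c cash and 65c credit, so neither should be preferred.
print(station_a == station_b)  # → True
```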

And finally, a question described by Stanovich [ 30 , 31 ] as testing our predisposition for cognitive operations that require the least computational effort.

Jack looking at Anne

  • (c). Cannot be determined

Stanovich reported that over 80% of people choose the “lazy” answer (c). The correct answer is (a).
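The puzzle statement is elided above; in its standard form (our assumption here), Jack, who is married, is looking at Anne, whose marital status is unknown, and Anne is looking at George, who is unmarried; the question is whether a married person is looking at an unmarried person. A case split on Anne shows the answer is "yes" either way, which is why (a) is correct:

```python
# Standard form of the puzzle (assumed): Jack -> Anne -> George.
looks_at = [("Jack", "Anne"), ("Anne", "George")]
married = {"Jack": True, "George": False}

def married_looks_at_unmarried(anne_married):
    """Does some married person look at an unmarried person, given Anne's status?"""
    status = dict(married, Anne=anne_married)
    return any(status[a] and not status[b] for a, b in looks_at)

# Whichever status Anne has, some married person is looking at an unmarried one.
print(all(married_looks_at_unmarried(m) for m in (True, False)))  # → True
```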

The above questions survey, in a clear problem solving setting, an ability to engage advanced cognitive processing in order to critically evaluate and possibly override initial gut reasoning, an ability to reason about probability within the framework of the law of large numbers and the relationship between randomness and patterning, an ability to isolate salient features of a problem and, with the last question in particular, an ability to map logical relations. It might be hypothesised that according to degrees of mathematical training, in line with the knowledge base provided and the claims of associated broad and enhanced problem-solving abilities in general, that participants with greater degrees of such training would outperform others on these questions. This hypothesis was investigated in this study. In addition, given that no previous study on this issue has examined the variety of problems used in this study, we also undertook an exploratory analysis to investigate whether there exist any associations between the problems in terms of their likelihood of correct solution. Similarities between problems might indicate which problem solving domains could be susceptible to the effects of mathematics training.

  • Introductory—First year, second semester, university students with weak high school mathematical results, only enrolled in the current unit as a compulsory component for their chosen degree, a unit not enabling any future mathematical pathway, a typical student may be enrolled in a Biology or Geography major;
  • Standard—First year, second semester, university students with fair to good high school mathematical results, enrolled in the current mathematics unit as a compulsory component for their chosen degree with the possibility of including some further mathematical units in their degree pathway, a typical student may be enrolled in an IT or Computer Science major;
  • Advanced1—First year, second semester, university mathematics students with very strong interest as well as background in mathematics, all higher year mathematical units are included as possible future pathway, a typical student may be enrolled in a Mathematics or Physics major;
  • Advanced2—Second year, second semester, university mathematics students with strong interest as well as background in mathematics, typically a direct follow on from the previously mentioned Advanced1 cohort;
  • Academic—Research academics in the mathematical sciences.

Participants

123 first year university students volunteered during “help on demand” tutorial times containing up to 30 students. These are course-allocated times that are supervised yet self-directed by students. This minimised disruption and discouraged coercion. 44 second year university students completed the questionnaire during a weekly one-hour time slot dedicated to putting the latest mathematical concepts to practice with the lecturer (where, by contrast with what occurs in tutorial times, the lecturer does most of the work and all students enrolled are invited). All these university students completed the questionnaire in normal classroom conditions; they were not placed under strict examination conditions. The lead author walked around to prevent discussion and coercion and there was minimum disruption. 30 research academics responded to local advertising and answered the questionnaire in their workplace while supervised.

The questionnaires were voluntary, anonymous and confidential. Participants were free to withdraw from the study at any time and without any penalty; no participant took this option, however. The questionnaires gathered demographic information, which included age, level of education attained, current qualification pursued, name of and years since the last qualification obtained, and an option for research academics to note their current speciality. Each problem task was placed on a separate page. Participants were not placed under time constraints but, while supervised, were asked to write their start and finish times on the front page of the survey to record approximate completion times. Speed of completion was not incentivised. Participants were not allowed to use calculators. A final “Comments Page” gave the option for feedback, including specifically whether participants had previously seen any of the questions. Questionnaires were administered in person and supervised to avoid collusion or consultation of external sources.

The responses were coded four ways: A) correct; B) standard error (the errors discussed above in The Study); C) other error; D) left blank.

The ethical aspects of the study were approved by the Human Research Ethics Committee of the University of Sydney, protocol number [2016/647].

The first analysis examined the total number of correct responses provided by the participants as a function of group. Scores ranged from 1 to 11 out of a total possible of 11 (Problem 6 had 2 parts) ( Fig 1 ). An ANOVA of these data indicated a significant effect of group (F(4, 192) = 20.426, p < .001, partial η2 = .299). Pairwise comparisons using Tukey’s HSD test indicated that the Introductory group performed significantly worse than the Advanced1, Advanced2 and Academic groups. There were no significant differences between the Advanced1, Advanced2 and Academic groups.
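The structure of the group-effect ANOVA above can be sketched as follows. The scores here are invented for illustration (the study's raw data are not reproduced in the text); only the analysis shape matches the reported test:

```python
# Sketch of a one-way ANOVA on correct-response counts by group.
# The per-participant scores below are hypothetical, NOT the study's data.
from scipy import stats

intro    = [3, 4, 5, 4, 6, 3, 5]    # hypothetical Introductory scores (out of 11)
standard = [6, 7, 5, 8, 6, 7, 6]    # hypothetical Standard scores
advanced = [9, 8, 10, 9, 11, 8, 10] # hypothetical Advanced/Academic scores

# f_oneway tests whether the group means differ
f_stat, p_value = stats.f_oneway(intro, standard, advanced)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant F would then be followed by pairwise comparisons (e.g. Tukey's HSD, as the authors report) to locate which groups differ.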


Error bars are one standard error of the mean.

https://doi.org/10.1371/journal.pone.0236153.g001

Overall solution time, while recorded manually and approximately, was positively correlated with group: the more training participants had received, the longer their solution times (r(180) = 0.247, p = .001). However, as can be seen in Fig 2 , this relationship is not strong.
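The correlation just reported has the following shape; the training codes and completion times below are invented for illustration, since per-participant times are not given in the text:

```python
# Sketch of a Pearson correlation between training level and solution time.
# Data are hypothetical: groups coded 1 (Introductory) through 5 (Academic).
from scipy import stats

training = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
minutes  = [8, 10, 9, 12, 11, 13, 12, 15, 14, 16]

# pearsonr returns the correlation coefficient r and its p-value
r, p = stats.pearsonr(training, minutes)
print(f"r = {r:.3f}, p = {p:.3f}")
```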


https://doi.org/10.1371/journal.pone.0236153.g002

A series of chi-squared analyses, and their Bayesian equivalents, were performed on each problem, to determine whether the distribution of response types differed as a function of group. To minimise the number of cells in which expected values in some of these analyses were less than 5, the Standard Error, Other Error and Blank response categories were collapsed into one category (Incorrect Response). For three of the questions, the expected values of some cells did fall below 5, and this was due to most people getting the problem wrong (Four Cards), or most people correctly responding to the problem (Bat and Ball, Coin Tosses). In these cases, the pattern of results was so clear that a statistical analysis was barely required. Significant chi-squared results were examined further with pairwise posthoc comparisons (see Table 1 ).
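The collapse-then-test step described above can be illustrated as follows. The counts are hypothetical stand-ins (the real distributions are in Table 1); the point is the 5×2 table of Correct vs collapsed Incorrect responses:

```python
# Sketch of one chi-squared analysis after collapsing Standard Error,
# Other Error and Blank into a single Incorrect category.
# Counts are illustrative, NOT the study's data.
from scipy.stats import chi2_contingency

# rows: Introductory..Academic; columns: [Correct, Incorrect]
table = [
    [5, 40],   # Introductory
    [10, 30],  # Standard
    [12, 20],  # Advanced1
    [13, 20],  # Advanced2
    [11, 19],  # Academic
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```

With five groups and two response categories, the test has (5−1)(2−1) = 4 degrees of freedom, matching the χ2(4) statistics reported throughout the results.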


https://doi.org/10.1371/journal.pone.0236153.t001

The three groups with the least amount of training in mathematics were far less likely than the other groups to give the correct solution (χ2(4) = 31.06, p < .001; BF10 = 45,045) ( Table 1 ). People in the two most advanced groups (Advanced2 and Academic) were more likely to solve the card problem correctly, although still fewer than half of the people in these groups did so. Further, these people were less likely to give the standard incorrect solution, so that most who were incorrect suggested a more cognitively elaborate answer, such as turning over all cards. The proportions of people in the Advanced2 and Academic groups (39% and 37%) who solved the problem correctly far exceeded the typical proportion observed with this problem (10%). Of note, also, is the relatively high proportion of those in the higher training groups who, when they made an error, did not make the standard error, a result similar to the one reported by Inglis and Simpson [ 11 ].

The cognitive reflection test

In the Lily Pads problem, although most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, it was also the case that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ2(4) = 27.28, p < .001; BF10 = 15,554), with the standard incorrect answer being the next most prevalent response for the two lower ability mathematics groups ( Table 1 ).

Performance on the Widgets problem was similar to performance on the Lily Pads problem in that most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, but the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ2(4) = 23.76, p < .001; BF10 = 516) ( Table 1 ). As with the Lily Pads and Widgets problems, people in the Standard, Advanced1, Advanced2 and Academic groups were highly likely to solve the Bat and Ball problem (χ2(4) = 35.37, p < .001; BF10 = 208,667). Errors were more likely from the least mathematically trained people (Introductory, Standard) than from the other groups ( Table 1 ).
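For readers unfamiliar with these CRT items, the intended solutions can be made explicit. The classic wordings are assumed here (a lily-pad patch that doubles daily and covers the lake in 48 days; 5 machines taking 5 minutes to make 5 widgets), since the paper does not restate them:

```python
# Worked answers to the Lily Pads and Widgets CRT items (classic wording assumed).

# Lily Pads: if the patch doubles every day and covers the lake on day 48,
# it covered half the lake exactly one day earlier.
days_to_cover_lake = 48
days_to_cover_half = days_to_cover_lake - 1  # 47, not the intuitive 24

# Widgets: each machine makes one widget in 5 minutes, so 100 machines
# make 100 widgets in the same 5 minutes (not the intuitive 100 minutes).
minutes_per_widget_per_machine = 5
time_for_100_machines_100_widgets = minutes_per_widget_per_machine

print(days_to_cover_half, time_for_100_machines_100_widgets)  # 47 5
```

The "standard incorrect answers" referred to in the text are the intuitive responses (24 days and 100 minutes) that these calculations correct.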

To compare performance on the CRT with previously published results, performance on the three problems (Lily Pads, Widgets, Bat and Ball) was combined. The number of people in each condition who solved 0, 1, 2, or 3 problems correctly is presented in Table 2 . The Introductory group were evenly distributed amongst the four categories, with 26% solving all three problems correctly. Around 70% of each of the other groups solved all 3 problems correctly, which is vastly superior to the 17% reported by Frederick [ 16 ].


https://doi.org/10.1371/journal.pone.0236153.t002

Responses to the Hospitals problem were almost universally split between correct and standard errors in the Standard, Advanced1, Advanced2 and Academic groups. Although this pattern of responses was also evident in the Introductory group, this group also exhibited more non-standard errors and non-responses than the other groups. However, the differences between the groups were not significant (χ2(4) = 4.93, p = .295; BF10 = .068) ( Table 1 ). Nonetheless, the performance of all groups exceeds the 20% correct response rate reported by Kahneman and Tversky [ 23 ].

The two versions of the Birth Order problem showed similar results, with correct responses being more likely in the groups with more training (i.e., Advanced1, Advanced2 and Academic), and responses being shared amongst the various categories in the Introductory and Standard groups (version A: χ2(4) = 24.54, p < .001; BF10 = 1,303; version B: χ2(4) = 25.77, p < .001; BF10 = 2,970) ( Table 1 ). Nonetheless, performance on both versions of the problem in this study was significantly better than the 82% error rate reported by Kahneman and Tversky [ 23 ].

All groups performed well on the Coin Tosses problem, with very few people in any condition committing errors. There were no obvious differences between the groups (χ2(4) = 3.70, p = .448; BF10 = .160) ( Table 1 ). Kahneman and Tversky [ 23 ] reported that people tend to make errors on this type of problem by choosing less patterned-looking sequences, but they did not report the relative proportions of people making errors versus giving correct responses. Clearly the sample in this study did not perform like those in Kahneman and Tversky’s study.
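The normative point behind the Coin Tosses item is compact: for a fair coin, every specific sequence of n tosses has the same probability, (1/2)^n, however patterned it looks. The particular sequences below are illustrative only, as the item's wording is not reproduced in the text:

```python
# For a fair coin, a "patterned" sequence and a "random-looking" sequence
# of the same length are exactly equally likely. Sequences are illustrative.
patterned      = "HHHTTT"
random_looking = "HTHTTH"

def seq_probability(seq: str) -> float:
    # each independent fair toss contributes a factor of 1/2
    return 0.5 ** len(seq)

print(seq_probability(patterned) == seq_probability(random_looking))  # True
```

The error Kahneman and Tversky describe is judging the patterned sequence as less likely, i.e. treating surface pattern as evidence against randomness.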

Responses on the Two Drivers problem were clearly distinguished by a high chance of error in the Introductory and Standard groups (over 80%), and a fairly good chance of being correct in the Advanced1, Advanced2 and Academic groups (χ2(4) = 46.16, p < .001; BF10 = 1.32 × 10^8) ( Table 1 ). Academics were the standout performers on this problem, although over a quarter of this group produced an incorrect response. Thus, the first two groups performed similarly to the participants in the Pelham and Neter [ 25 ] study, 80% of whom gave an incorrect response.

Responses on the Petrol Station problem were marked by good performance by the Academic group (73% providing a correct response), and just over half of each of the other groups correctly solving the problem. This difference was not significant (χ2(4) = 4.68, p = .322; BF10 = .059) ( Table 1 ). Errors were fairly evenly balanced between standard and other, except for the Academic group, who were more likely to provide a creative answer if they made an error. Thaler [ 27 ] reported that most people get this problem wrong. In this study, however, most people got this problem correct, although this average was boosted by the Academic group.

Responses on the Jack looking at Anne problem were generally standard errors, except for the Advanced2 and Academic groups, which were evenly split between standard errors and correct responses (χ2(4) = 18.03, p = .001; BF10 = 46) ( Table 1 ). Thus, apart from these two groups, the error rate in this study was similar to that reported by Stanovich [ 30 ], where 80% of participants were incorrect.

A series of logistic regression analyses were performed in order to examine whether the likelihood of solving a particular problem correctly could be predicted on the basis of whether other problems were solved correctly. Each analysis involved selecting performance (correct or error) on one problem as the outcome variable, and performance on the other problems as predictor variables. Training (amount of training) was also included as a predictor variable in each analysis. A further logistic regression was performed with training as the outcome variable, and performance on all of the problems as predictor variables. The results of these analyses are summarised in Table 3 . There were three multi-variable relationships observed in these analyses, which can be interpreted as the likelihood of solving one problem in each group being associated with solving the others in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. Training also featured in each of these sets, moderating the relationships as per the results presented above for each problem.
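The structure of one such regression can be made concrete with a sketch; the variable names and the data-generating rule below are assumptions for illustration, not the study's data:

```python
# Sketch of a logistic regression predicting success on one problem
# (Lily Pads) from success on another (Widgets) plus training level.
# All data are simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
training = rng.integers(1, 6, n)  # 1 = Introductory .. 5 = Academic
# Widgets success made more likely with more training (invented rule)
widgets = (rng.random(n) < training / 6).astype(int)
# Lily Pads success made more likely by Widgets success and training
lily = (rng.random(n) < (widgets * 0.4 + training / 10)).astype(int)

X = np.column_stack([widgets, training])  # predictors
model = LogisticRegression().fit(X, lily)  # outcome: correct (1) / error (0)
print(model.coef_)  # one coefficient per predictor
```

In the study's analyses, a reliable coefficient for one problem when predicting another is what places both problems in the same associated set.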


https://doi.org/10.1371/journal.pone.0236153.t003

The final “Comments Page” revealed that participants overwhelmingly enjoyed the questions. Any analysis of previous exposure to the tasks proved impossible, as there was little to no alignment in participants’ degree of recall, if any, or even in their perceptions of what exposure entailed. For example, some participants confused being exposed to the particular tasks with being habitually exposed to puzzles, or even mathematics problems, more broadly.

In general, the amount of mathematics training a group had received predicted their performance on the overall set of problems. The greater the training, the more problems were answered correctly, and the slower the recorded response times. There was no obvious difference between the Advanced1, Advanced2 and Academic groups on either of these measures; however, there were clear differences between these groups and the Introductory and Standard groups, with the former exhibiting clearly superior accuracy. While time records were taken only approximately, so as to avoid adding time pressure as a variable, the fact that the Advanced1, Advanced2 and Academic groups recorded more time in their consideration of the problems may suggest that a “pause and consider” approach to such problems is characteristic of the advanced groups. This is in line with an eye-movement tracking study of mathematically trained students attempting the Four Cards Problem, in which participants who had not chosen the standard error spent longer considering the card linked to the matching bias effect [ 14 ]. It is important to note, however, that longer response times may reflect cognitive processes other than deliberation [ 32 ].

Performance on some problems was associated with performance on other problems. That is, if someone correctly answered a problem in one of these sets, they were also highly likely to correctly answer the other problems in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. This differs from how these problems have typically been clustered a priori in the research literature: (I) Lily Pads, Widgets and Bat and Ball (CRT); (II) Hospitals and Two Drivers (explained below); (III) Hospitals, Birth Order and Coin Tosses (representativeness heuristic); (IV) Birth Order and Coin Tosses (probability theory). Consideration of these problem groupings follows.

Correctly answering all three problems in (I) entailed not being distracted by particular pieces of information in the problems, so as to stay focused on uncovering the real underlying relationships. The Lily Pads and Widgets problems can mislead if attention is overly focused on the numbers, and conversely, the Petrol Station problem can mislead if there is too much focus on the idea of a discount. While the Lily Pads and Widgets problems are traditionally paired with the Bat and Ball problem in the CRT, it may be that performance on the Bat and Ball problem did not appear as part of this set due to an added level of difficulty. With the problems in (I), avoiding being distracted by certain parts of the questions at the expense of others leads almost directly to the correct answer. With the Bat and Ball problem, however, a further step of mathematical reasoning is still required: finding two numbers that sum to one given value while differing by another.
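That further step can be written out explicitly, assuming the classic wording of the item (bat and ball cost $1.10 together; the bat costs $1.00 more than the ball), which the paper does not restate:

```python
# The algebra behind the Bat and Ball item (classic wording assumed):
#   ball + bat = 1.10  and  bat = ball + 1.00
# Substituting gives ball + (ball + 1.00) = 1.10, so ball = (1.10 - 1.00) / 2.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9  # the pair satisfies both constraints
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The standard error is the intuitive ball = $0.10, which satisfies the sum constraint only if the difference constraint is ignored.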

With the problems in (II), it is of interest that the Two Drivers problem was created specifically to be paired with the Hospitals problem to test for motivation in problem solving [ 23 ]. Within this framework, more transparent versions of these problems were successfully devised to manipulate difficulty. The Two Drivers problem was amended to have Driver B travelling at exactly 5 mph during the first half of the race and at exactly 95 mph during the last half of the race. The Hospitals problem was amended so that the smaller hospital would have “only 2” babies born each day, and so that for a period of one year the hospitals recorded the number of days on which all of the babies born were boys. Could the association in (II) be pointing to how participants overcome initial fictitious mathematical rules? Perhaps they reframe the question in simpler terms to see the pattern. The Four Cards Problem also elicited a high number of incorrect answers where, in association with mathematical training, the standard incorrect solution was avoided in favour of more cognitively elaborate ones. Indeed, a gradation effect appeared across the groups whereby the standard error of the “D and 3” cards becomes “D only” ( Table 4 ). Adrian Simpson and Derrick Watson found a comparable result across their two groups [14 p61]. This could again point to participants having avoided an initial fictitious rule of simply concentrating on items directly found in the question, and then seeking to reframe the question to unearth the logical rule to be deduced. An added level of difficulty with this question may be why participants become trapped in a false answer. The eye-movement tracking study mentioned above supports this theory.
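The time-weighting that defeats intuition in the amended Two Drivers problem can be checked directly, assuming "half" refers to half the race distance:

```python
# Average speed of the amended Driver B: 5 mph over the first half of the
# distance, 95 mph over the second half. Because far more time is spent at
# the slow speed, the average is the harmonic mean of 5 and 95, not the
# arithmetic mean of 50.
d = 1.0                          # total distance (arbitrary units)
t = (d / 2) / 5 + (d / 2) / 95   # time = distance / speed on each half
average_speed = d / t
print(round(average_speed, 2))   # 9.5
```

Equivalently, average_speed = 2·(5·95)/(5+95) = 9.5 mph, which is why the steady driver wins.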


https://doi.org/10.1371/journal.pone.0236153.t004

The problems in (III) fit naturally together as part of basic probability theory, a topic participants would have assimilated, or not, as part of various education curricula. While the equal likelihood of all possible outcomes with respect to a coin toss may be culturally assimilated, the same may not be as straightforward for birth gender outcomes, where such assumptions could be swayed by biological hypotheses or folk wisdom [ 33 ]. The gradation of the results in terms of mathematical training does not support this possibility.

The effect of training on performance accuracy was more obvious in some problems compared to others, and to some extent, this was related to the type of problem. For instance, most of the problems in which performance was related to training (Four Cards, CRT [Lily Pads, Widgets, Bat and Ball], Two Drivers, Jack looking at Anne) could be classed as relying on logical and/or critical thinking. The one exception was the Birth Order problems, which are probability related.

In contrast, two of the three problems in which training did not appear to have much impact on performance (Hospitals and Coin Tosses) require domain-specific knowledge. The Hospitals problem requires a degree of knowledge about sampling statistics, a topic of quite distinct flavour with which not all mathematically trained individuals gain familiarity. On the other hand, the fact that all groups performed well on the Coin Tosses problem is in line with a level of familiarity with basic probability, originally presented at high school. While the framing of patterning as negatively correlated with randomness is similar to that appearing in the Birth Order question, in the Birth Order question this aspect is arguably more concealed. These results, and problem grouping (III), could be pointing to an area for improvement in teaching, where the small gap in knowledge required to go from answering the Coin Tosses problem correctly to achieving similarly with the Birth Order problem could be easily addressed. A more formal introduction to sampling statistics in mathematical training could potentially bridge this gap, and could further be extended towards improvement on the Hospitals problem.
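The sampling-statistics point behind the Hospitals problem can be demonstrated with a binomial calculation. The classic figures of 15 and 45 births per day are assumed here, as the paper does not restate the original problem:

```python
# Why the smaller hospital records more "deviant" days: with fewer births
# per day, the proportion of boys fluctuates more around 50%.
# Birth counts (15 vs 45 per day) are the classic figures, assumed here.
from scipy.stats import binom

def p_more_than_60_percent_boys(n_births: int) -> float:
    # P(X > 0.6 * n) for X ~ Binomial(n, 0.5)
    threshold = int(0.6 * n_births)
    return 1 - binom.cdf(threshold, n_births, 0.5)

small = p_more_than_60_percent_boys(15)
large = p_more_than_60_percent_boys(45)
print(small > large)  # True: the small hospital sees such days more often
```

The standard error is to judge the two hospitals as equally likely to record such days, ignoring sample size.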

The other problem where performance was unrelated to training, the Petrol Station problem, cannot be characterised similarly. It is more of a logical/critical thinking type problem, where there remains some suggestion that training may have impacted performance, as the Academic group seemed to perform better than the rest of the sample. An alternate interpretation of this result is therefore that this problem should not be isolated but grouped with the other problems where performance is affected by training.

  • The Introductory group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Introduction to Differentiation, Applications of the Derivative, Antiderivatives, Areas and the Definite Integral), Financial Mathematics, Statistical Analysis. The Introductory group then explored concepts in mathematical modelling with emphasis on the importance of calculus in their first semester of mathematical studies.
  • The Standard group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Rates of Change, Integration including the method of substitution, trigonometric identities and inverse trigonometric functions, Areas and Volumes of solids of revolution, some differential equations), Combinatorics, Proof (with particular focus on Proof by Mathematical Induction), Vectors (with application to projectile motion), Statistical Analysis. In first semester their mathematical studies then covered a number of topics the Advanced1 group studied prior to gaining entrance at university; further details on this are given below.
  • The Advanced1 group’s mathematics high school syllabus studied prior to first semester course entry covered: the same course content the Standard group covered at high school plus extra topics on Proof (develop rigorous mathematical arguments and proofs, specifically in the context of number and algebra and further develop Proof by Mathematical Induction), Vectors (3 dimensional vectors, vector equations of lines), Complex Numbers, Calculus (Further Integration techniques with partial fractions and integration by parts), Mechanics (Application of Calculus to Mechanics with simple harmonic motion, modelling motion without and with resistance, projectiles and resisted motion). The Standard group cover these topics in their first semester university studies in mathematics with the exclusion of further concepts of Proof or Mechanics. In first semester the Advanced1 group have built on their knowledge with an emphasis on both theoretical and foundational aspects, as well as developing the skill of applying mathematical theory to solve practical problems. Theoretical topics include a host of theorems relevant to the study of Calculus.

In summary, at the point of our study, the Advanced1 group had more knowledge of, and practice with, rigorous mathematical arguments and proofs in the context of number and algebra, and more in-depth experience with Proof by Induction, but the bulk of their extra knowledge rests with a much deeper knowledge of Calculus. They have had longer experience with a variety of integration techniques, and have worked with a variety of applications of calculus to solve practical problems, including a large section on mechanics at high school. In first semester at university there has been a greater focus on theoretical topics, including a host of theorems and associated proofs relevant to the topics studied. Compared to the Introductory and Standard groups, the Advanced1 group have only widened the mathematics knowledge gap since their choice of post-compulsory mathematics at high school. The Advanced2 group come directly from an Advanced1 cohort, and the Academic group would have reached the Advanced1 group’s proficiency as part of their employment. So, do specific reasoning skills result from this level of abstract reasoning? Our findings suggest this should certainly be an area of investigation, and it links interestingly with other research. In studying one of the thinking tasks in particular (the Four Cards Problem), and its context of conditional inference more specifically, Inglis and Simpson [ 15 ] found a clear difference between undergraduates in mathematics and undergraduates in other university disciplines, yet also showed a lack of development over first-year university studies on conditional inference measures. A follow-up study by Attridge and Inglis [ 22 ] then zeroed in on post-compulsory high school mathematical training and found that students with such training did develop their conditional reasoning to a greater extent than a control group over the course of a year, despite having received no explicit tuition in conditional logic.
The development, however, while demonstrated not to be the result of a domain-general change in cognitive capacity or thinking disposition, and most likely associated with the domain-specific study of mathematics, revealed a complex pattern of endorsing more of some inferences and fewer of others. The study here focused on a much broader problem set associated with logical and critical thinking, and it too suggests a more complex picture of how mathematics training may contribute to problem-solving styles. A more intricate account of the impact of mathematical training on problem-solving techniques appears to be required.

There is also a final interpretation to consider: that people in the Advanced1, Advanced2 and Academic groups did not gain anything from their mathematics training in terms of their ability to solve these problems. Instead, given studies showing no correlation between many of these problems and what is currently measured as intelligence [ 30 ], they might simply have been people of a particular intelligence or thinking disposition to start with, who were able to use that intelligence not only to solve these problems, but also to survive the challenges of their mathematics training.

That the CRT has traditionally been used as a measure of baseline thinking disposition, and that performance on it has been found to be immutable across tested groups, is of particular interest, since our results show a clear possible training effect on these questions. The CRT is tied to a willingness to engage in effortful thinking, which presents as an ability amenable to training. It is beyond the scope of this study, but a thorough review of CRT testing is suggestive of a broader appreciation, and a better framework for understanding, thinking disposition, ability and potential ability.

Mathematical training appears to be associated with certain thinking skills, but there are clearly some subtleties that need to be teased apart. The thinking tasks here add to the foundational results, the aim being a firmer platform on which to eventually base more targeted and illustrative inquiry. If thinking skills can be fostered, could first-year university mathematics teaching be improved so that all students reach the Advanced1 group’s level of reasoning? Do university mathematics courses become purely about domain-specific knowledge from this point on? Intensive training has been shown to impact the brain and cognition across a number of domains, from music [ 34 ], to video gaming [ 35 ], to Braille reading [ 36 ]. The hypothesis that mathematics, with its highly specific practice, fits within this list remains legitimate, but simply uncharted. With our current level of understanding, it is worth appreciating the careful wording of the NYU Courant Institute on ‘Why Study Math?’, where there is no assumption of causation: “Mathematicians need to have good reasoning ability in order to identify, analyze, and apply basic logical principles to technical problems.” [ 37 ].

Limitations

One possible limitation of the current study is that the problems may have been too easy for the more advanced people, and so we observed a ceiling effect (i.e., some people obtained 100% correct on all problems). This was most obvious in the Advanced1, Advanced2 and Academic groups. It is possible that participants in these groups had developed logical and critical thinking skills throughout their mathematical training that were sufficient to cope with most of the problems used in this study, and so this would support the contention that training in mathematics leads to the development of logical and critical thinking skills useful in a range of domains. Another interpretation is that participants in these groups already possessed the necessary thinking skills for solving the problems in this study, which is why they are able to cope with the material in the advanced units they were enrolled in, or complete a PhD in mathematics and hold down an academic position in a mathematics department. This would then suggest that training in mathematics had no effect on abstract thinking skills—people in this study possessed them to varying extents prior to their studies. This issue might be settled in a future study that used a greater number of problems of varying difficulties to maximise the chances of finding a difference between the three groups with the most amount of training. Alternatively, a longitudinal study that followed people through their mathematics training could determine whether their logical and critical thinking abilities changed throughout their course.

A further limitation of the study may be that several of the reasoning biases examined in this study were measured by only one problem each (i.e., Four Cards Problem, Two Drivers, Petrol Station, Jack looking at Anne). A more reliable measure of these biases could be achieved by including more problems that tap into these biases. This would, however, increase the time required of participants during data collection, and in the context of this study, would mean a different mode of testing would likely be required.

Broad sweeping intuitive claims of the transferable skills endowed by a study of mathematics require evidence. Our study uniquely covers a wide range of participants, from limited mathematics training through to research academics in the mathematical sciences. It furthermore considered performance on 11 well-studied thinking tasks that typically elude participants in psychological studies and on which results have been uncorrelated with general intelligence, education levels and other demographic information [ 15 , 16 , 30 ]. We identified different performances on these tasks with respect to different groups, based on level of mathematical training. This included the CRT which has developed into a method of measuring baseline thinking disposition. We identified different distributions of types of errors for the mathematically trained. We furthermore identified a performance threshold that exists in first year university for those with high level mathematics training. This study then provides insight into possible changes and adjustments to mathematics courses in order for them to fulfil their advertised goal of reaching improved rational and logical reasoning for a higher number of students.

It is central to any education program to have a clear grasp of the nature of what it delivers and how, but arguably especially so for the core discipline that is mathematics. In 2014 the Office of The Chief Scientist of Australia released a report “Australia’s STEM workforce: a survey of employers” where transferable skills attributed to mathematics were also ones that employers deemed as part of the most valuable [ 38 ]. A better understanding of what mathematics delivers in this space is an opportunity to truly capitalise on this historical culture-crossing subject.

Supporting information

https://doi.org/10.1371/journal.pone.0236153.s001

Acknowledgments

The authors would like to thank Jacqui Ramagge for her proof reading and input, as well as support towards data collection.

  • 1. Smith A. Making mathematics count: The report of Professor Adrian Smith’s inquiry into post-14 mathematics education. 2004. London: The Stationery Office.
  • 2. AMSI, Vision for a Maths Nation. 2015. http://amsi.org.au/publications/a-vision-for-a-maths-nation/
  • 3. British Columbia [Internet]. Mathematics; Goals and Rationale. 2016 [cited 2019 Dec 5]. https://curriculum.gov.bc.ca/curriculum/mathematics/core/goals-and-rationale
  • 4. Monash University [Internet]. Mathematical Sciences. 2019 [cited 2019 Jul 30]. https://www.monash.edu/science/schools/mathematical-sciences/current .
  • 5. The University of Sydney [Internet]. MATH1014. 2017 [cited 2019 Dec 5]. http://www.maths.usyd.edu.au/u/UG/TU/YR1ADMIN/r/MATH1014.pdf .
  • 6. The University of Sydney [Internet]. MATH2965. 2016 [cited 2016 Dec 12]. http://www.maths.usyd.edu.au/u/UG/IM/MATH2965/
  • 7. The University of Sydney [Internet]. MATH3066. 2017 [cited 2017 Dec 8]. http://www.maths.usyd.edu.au/u/UG/SM/MATH3066/r/2017info3066.pdf .
  • 8. Cambridge University [Internet]. Mathematical Tripos. 2019 [cited 2019 Jul 30]. https://www.maths.cam.ac.uk/undergrad/course/transferable_skills .
  • 9. Speelman CP, Kirsner K. Beyond the learning curve: The construction of mind. Oxford: Oxford University Press; 2005.
  • 10. Fadel C. Mathematics for the 21 st Century: What Should Students Learn? Boston, Massachusetts: Center for Curriculum Redesign; 2014.
  • 11. Inglis M, Simpson A. Heuristic biases in mathematical reasoning. In: Chick HL, Vincent JL, editors. Proceedings of the 29th Conference of the International Group for the Psychology of Mathematics Education. Melbourne: PME; 2005. p. 177–84.
  • 12. Manktelow KI. Reasoning and Thinking. UK: Psychology Press; 1999.
  • 14. Inglis M, Attridge N. Does mathematical study develop logical thinking? Testing the theory of formal discipline. London: World Scientific Publishing Europe Ltd; 2016.
  • 24. Nisbett RE. Can reasoning be taught? In: Callan E, Grotzer T, Kagan J, Nisbett RE, Perkins DN, Shulman LS, editors. Education and a Civic Society: Teaching Evidence-based Decision Making. Cambridge, MA: American Academy of Arts & Sciences; 2009.
  • 26. Galotti KM. Cognitive psychology in and out of the laboratory. Belmont, CA: Brooks/Cole; 1994.
  • 37. NYU [Internet]. Why Study Math? 2019 [cited 2019 Jul 30]. https://math.nyu.edu/dynamic/undergrad/overview/why-study-math/
  • 38. Office of The Chief Scientist. Australia’s STEM workforce: a survey of employers. Barton ACT: Deloitte Access Economics; 2014.

How logical reasoning works

logical reasoning cognitive skill

What is logical reasoning?

Logical reasoning is the process of using a rational, systematic series of steps to reach a conclusion about a given statement. Situations that call for logical reasoning require structure, sensible relationships between given facts, and coherent chains of reasoning. Because logical reasoning demands that you study a problem objectively, analysis is an important part of the process. Logical reasoning starts with a proposition or statement, which can be either true or false.

Why is logical reasoning important?

Logical reasoning, in combination with other cognitive skills, is an important skill you use in all kinds of daily situations. It helps you make important decisions, discern the truth, solve problems, come up with new ideas and set achievable goals. Logical reasoning is also an important component of IQ tests that measure intelligence.

The three types of logical reasoning

Logical reasoning can be divided into deductive, inductive and abductive reasoning. While inductive reasoning starts with specific instances and moves to a generalized conclusion, deductive reasoning goes from a generalized principle that is known to be true to a specific conclusion that must also be true. Abductive reasoning draws the most probable conclusion from what you know.

logical reasoning types

We’ll explain each type of logical reasoning further:

Inductive reasoning

With inductive reasoning, a number of specific observations lead to a general rule. With this method, the premises are viewed as supplying some evidence for the truth of a conclusion, so there is always an element of probability. In other words, you form a generalization based on what is known or observed.

While this may sound like the kind of theory you would use during a debate or discussion, you also apply it every day in much simpler situations. Consider an example: there are 28 balls in a basket, each either red or white. To estimate the numbers of red and white balls, you take a sample of four balls. Your sample consists of three red balls and one white ball. A reasonable inductive generalization would be that the basket contains 21 red and 7 white balls. As already explained, a conclusion drawn from this type of reasoning isn't certain, only probable based on the given evidence (the sample of balls you took).

Questions that require inductive reasoning appear in IQ tests. The image below shows a somewhat more complex example than the ball question; solving it requires both inductive reasoning and pattern recognition. Looking at the sequence of tiles with different patterns of dots, which tile should take the place of the question mark? A, B, C, D, E or F?

inductive reasoning example question
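The ball estimate above is simple proportional scaling from a sample to the whole population. A minimal sketch (the function name and colour labels are my own, purely illustrative):

```python
# Illustrative sketch of inductive generalisation: scale the proportions
# observed in a small sample up to the whole basket of 28 balls.
def inductive_estimate(total, sample):
    """Estimate category counts in the population from sample proportions."""
    n = len(sample)
    return {colour: round(total * sample.count(colour) / n)
            for colour in set(sample)}

# The sample of four balls: three red, one white.
sample = ["red", "red", "red", "white"]
print(inductive_estimate(28, sample))  # estimates 21 red, 7 white
```

The estimate is only probable, not certain: a different sample of four balls could easily yield a different split.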

Deductive reasoning

With deductive reasoning, factual statements are used to come to a logical conclusion. If all the premises (factual statements) are true, the terms are clear and all the rules of deductive logic are followed to come to a conclusion, then the conclusion will also be true. In this case, the conclusion isn’t probable, but certain. Deductive reasoning is also known as “top-down” logic, because it (in most cases) starts with a general statement and will end with a specific conclusion.

We’ll explain deductive reasoning with an example, with 2 given premises:

It’s dangerous to drive while it’s freezing (premise 1)

It is currently freezing outside (premise 2)

So, we now know that it is dangerous to drive when it is freezing, and that it is currently freezing outside. Using deductive reasoning, these two premises lead to a necessarily true conclusion, which is:

It is currently dangerous to drive outside (conclusion)

Situations in which you use deductive reasoning can come in many forms, such as mathematics. Whether you are designing your own garden or managing your time, you use deductive reasoning while doing math daily. An example is solving the following math problem:

The angles of a triangle always sum to 180° (premise 1)

The following triangle has one right angle, which is 90° (premise 2)

The second angle is 60° (premise 3)  

deductive example math

How many degrees is the third angle (X)? To answer this question, you can use the three premises to determine the third angle. The conclusion: 180° (premise 1) − 90° (premise 2) − 60° (premise 3) = 30° (conclusion)
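The arithmetic in this deductive chain can be written out directly (a trivial sketch with the premise values hard-coded):

```python
# Deductive conclusion from the three premises about the triangle.
angle_sum = 180       # premise 1: angles of a triangle sum to 180 degrees
right_angle = 90      # premise 2: the right angle
second_angle = 60     # premise 3: the second angle

x = angle_sum - right_angle - second_angle  # the conclusion follows necessarily
print(x)  # 30
```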

Abductive reasoning

With abductive reasoning, the major premise is evident, but the minor premise(s) and therefore the conclusion are only probable. You start with an observation and then find the most likely explanation for it. In other words, it is the type of logical reasoning you use when you form a conclusion from the (little) information that is available. One example of abductive reasoning is a decision made by a jury: a group of people must reach a verdict based on the available evidence and witness testimonies, forming a conclusion from possibly incomplete information. A more everyday example: you wake up in the morning and head downstairs. In the kitchen, you find a plate on the table, a half-eaten sandwich and half a glass of milk. From the available premises, you come up with the most likely explanation, which could be that your partner woke up before you and left in a hurry without finishing breakfast.

inductive deductive abductive reasoning example

How does logical thinking relate to problem-solving?

As previously mentioned, the different types of logical reasoning (inductive, deductive and abductive) help you form conclusions based on the current situation and known facts. This closely corresponds to problem-solving, as finding the most probable solution to a problem follows the same pattern. Logical thinking, and thereby problem solving, goes through the following five steps to draw a conclusion and/or find a solution:

Collecting information about the current situation. Determining what the current problem is, and what premises apply. Let’s say you want to go out for a drive, but it’s freezing outside.

Analyzing this information. What information is relevant to the situation, and what isn’t. In this case, the fact that it’s freezing is relevant for your safety on the road. The fact that you might get cold isn’t, as you’d be in your car.

Forming a conclusion. What can you conclude from this information? The roads might be more dangerous because it’s freezing.

Support your conclusion. You might look at traffic information to see that there have been more accidents today, in which case, that supports the conclusion that driving is more dangerous today.

Defend your conclusion. Is this conclusion correct for your case? If you don't have winter tires, the conclusion holds even more strongly than if you do.

problem solving steps

How to improve logical thinking and problem-solving skills?

Because there are so many different situations in which you use logical thinking and problem-solving, this isn’t a cognitive skill you can train specifically. Luckily, there are many methods that might help you to improve your logical thinking skills. These include methods to keep your general cognitive abilities healthy as well as methods to train your logical thinking skills. These are:

Learning something new

Social interaction

Healthy nutrition

Get enough sleep

Avoid stress

Preferably no alcohol

Spend time on creative hobbies

Practice questioning

Try to anticipate the outcome of your decisions

Brain training to challenge your logical reasoning skills

improve logical thinking skill


PLOS ONE

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Clio Cresswell

1 School of Mathematics and Statistics, The University of Sydney, Sydney, Australia

Craig P. Speelman

2 School of Arts and Humanities, Edith Cowan University, Joondalup, Australia

Associated Data

All relevant data are within the paper and its Supporting Information files.

Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically or to have improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational and logical reasoning tasks aggregated from various psychological studies. These tasks, which included the Cognitive Reflection Test and the Wason Selection Task, are of particular interest as they have typically and reliably eluded participants in all studies, and results have been uncorrelated with general intelligence, education levels and other demographic information. The results in this study revealed that in general the greater the mathematics training of the participant, the more tasks were completed correctly, and that performance on some tasks was also associated with performance on others not traditionally associated. A ceiling effect also emerged. The work is deconstructed from the viewpoint of adding to the platform from which to approach the greater, and more scientifically elusive, question: are any skills associated with mathematics training innate or do they arise from skills transfer?

Introduction

Mathematics is often promoted as endowing those who study it with a number of broad thinking skills, such as an ability to think logically, analytically, critically and abstractly, and a capacity to weigh evidence with impartiality. This view of mathematics as providing transferable skills can be found across educational institutions, governments and corporations worldwide, and it is material to the place of mathematics in curricula.

Consider the UK government’s commissioned inquiry into mathematics education, “Making Mathematics Count”, which offers the justification that “mathematical training disciplines the mind, develops logical and critical reasoning, and develops analytical and problem-solving skills to a high degree” [ 1 p11]. The Australian Mathematical Sciences Institute very broadly states in its policy document “Vision for a Maths Nation” that “Not only is mathematics the enabling discipline, it has a vital productive role planning and protecting our well-being” (emphasis in original) [ 2 ]. In Canada, British Columbia’s new 2016 K-9 curriculum expressly mentions as part of its “Goals and Rationale”: “The Mathematics program of study is designed to develop deep mathematical understanding and fluency, logical reasoning, analytical thought, and creative thinking.” [ 3 ]. Universities, too, often make such specific claims with respect to their teaching programs. “Mathematics and statistics will help you to think logically and clearly, and apply a range of problem-solving strategies” is claimed by The School of Mathematical Sciences at Monash University, Australia [ 4 ]. The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes as part of particular course objectives and outcomes skills that include “enhance your problem-solving skills” as part of studies in first year [ 5 ], “develop logical thinking” as part of studies in second year (a statement in fact drafted by the lead author) [ 6 ], and “be fluent in analysing and constructing logical arguments” as part of studies in third year [ 7 ]. The University of Cambridge’s Faculty of Mathematics, UK, provides a dedicated document “Transferable Skills in the Mathematical Tripos” as part of its undergraduate mathematics course information, which again lists “analytic ability; creativity; initiative; logical and methodical reasoning; persistence” [ 8 ].

In contrast, psychological research, which has been empirically investigating the concept of transferability of skills since the early 1900s, points quite oppositely to reasoning skills as being highly domain specific [ 9 ]. Therefore, support for claims that studying mathematics engenders more than specific mathematics knowledge is highly pertinent. And yet it is largely absent. The 2014 Centre for Curriculum Redesign (CCR) four part paper “Mathematics for the 21st Century: What Should Students Learn?” concludes in its fourth paper titled “Does mathematics education enhance higher-order thinking skills?” with a call to action “… there is not sufficient evidence to conclude that mathematics enhances higher order cognitive functions. The CCR calls for a much stronger cognitive psychology and neuroscience research base to be developed on the effects of studying mathematics” [ 10 ].

Inglis and Simpson [ 11 ], bringing up this very issue, examined the ability of first-year undergraduate students from a high-ranking UK university mathematics department, on the “Four Cards Problem” thinking task, also known as the Wason Selection Task. It is stated as follows.

Each of the following cards has a letter on one side and a number on the other.

[Image: four cards showing a letter or a number on their visible faces, including D, 3 and 7]

Here is a rule: “if a card has a D on one side, then it has a 3 on the other”. Your task is to select all those cards, but only those cards, which you would have to turn over in order to find out whether the rule is true or false. Which cards would you select?

This task involves understanding conditional inference, namely understanding the rule “If P then Q” and with this, deducing the answer as “P and not Q” or “D and 7”. Such logical deduction indeed presents as a good candidate to test for a potential ability of the mathematically trained. This task has also been substantially investigated in the domain of the psychology of reasoning [ 12 p8], revealing across a wide range of publications that only around 10% of the general population reach the correct result. The predominant mistake is to pick “D and 3”; in the original study by Wason [ 13 ] it is suggested that this was picked by 65% of people. This poor success rate, along with a standard mistake, has fuelled interest in the task as well as attempts to understand why it occurs. A prevailing theory is the so-named matching bias effect: the tendency to concentrate disproportionately on items specifically mentioned in the situation, as opposed to reasoning according to logical rules.
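The selection logic ("turn every P card and every not-Q card") can be sketched as follows; the visible faces D, K, 3, 7 are assumed from the standard version of the task:

```python
# Which cards must be turned to test "if a card has a D on one side,
# then it has a 3 on the other"?
def must_turn(face):
    """A card must be turned if its hidden side could falsify the rule."""
    if face.isalpha():
        return face == "D"   # P: a D card might hide a non-3
    return face != "3"       # not-Q: a non-3 card might hide a D

cards = ["D", "K", "3", "7"]
print([c for c in cards if must_turn(c)])  # ['D', '7']
```

The "K" and "3" cards need not be turned: whatever is on their hidden sides, the rule cannot be falsified by them.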

Inglis and Simpson’s results isolated mathematically trained individuals with respect to this task. The participants were under time constraint, and 13% of the first-year undergraduate mathematics students sampled reached the correct response, compared to 4% of the non-mathematics (arts) students who were included. Of note also was the 24% of mathematics students, as opposed to 45% of the non-mathematics students, who chose the standard mistake. The study indeed unveiled that mathematically trained individuals were significantly less affected by the matching bias effect with this problem than the individuals without mathematics training. However, the achievement of the mathematically trained group was still far from masterful, and their preponderance of non-standard mistakes compared with non-mathematically trained people is suggestive. Mathematical training appears to engender a different thinking style, but it remains unclear what the difference is.

Inglis, Simpson and colleagues proceeded to follow up their results with a number of studies concentrated on conditional inference in general [ 14 , 15 ]. A justification for this single investigatory pathway was that if transfer of knowledge is present, something subtle to test for in the first place, a key consideration should be the generalisation of learning rather than the application of skills learned in one context to another (where experimenter bias in the choice of contexts is more likely to be an issue). For this they typically used sixteen “if P then Q” comprehension tasks, where their samples across a number of studies have included 16-year-old pre-university mathematics students (from England and Cyprus), mathematics honours students in their first year of undergraduate university study, third year university mathematics students, and associated control groups. The studies have encompassed controls for general intelligence and thinking disposition prior to training, as well as follow-ups of up to two years to address the issue of causation. The conclusive thinking pattern that has emerged is a tendency of the mathematical groups towards a greater likelihood of rejecting the invalid denial of the antecedent and affirmation of the consequent inferences. But with this, and this was validated by a second separate study, the English mathematics group actually became less likely to endorse the valid modus tollens inference. So again, mathematical training appears to engender a different thinking style, but there are subtleties and it remains unclear what the exact difference is.

This project was designed to broaden the search on the notion that mathematics training leads to increased reasoning skills. We focused on a range of reasoning problems considered in psychological research to be particularly insightful into decision making, critical thinking and logical deduction, distinguished by the fact that the general population generally struggles to answer them correctly. An Australian sample adds diversity to the current enquiries, which have been European-focussed. Furthermore, in an effort to identify the impact of mathematics training through a possible gradation effect, individuals with different levels of mathematics training were tested for performance.

Well-studied thinking tasks from a variety of psychological studies were chosen. Their descriptions, associated success rates and other pertinent details follow. All were chosen because the correct answer typically eludes participants in favour of a standard mistake.

The three-item Cognitive Reflection Test (CRT) was used as introduced by Frederick [ 16 ]. This test was devised in line with the theory that there are two general types of cognitive activity: one that operates quickly and without reflection, and another that requires not only conscious thought and effort, but also an ability to reflect on one’s own cognition by including a step of suppression of the first to reach it. The three items in the test involve an incorrect “gut” response and further cognitive skill is deemed required to reach the correct answer (although see [ 17 ] for evidence that correct responses can result from “intuition”, which could be related to intelligence [ 18 ]).

Lily Pads

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Widgets

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Bat and ball

A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?

The solutions are: 47 days for the Lily Pads problem, 5 minutes for the Widgets problem and 5 cents for the Bat and Ball problem. The considered intuitive, but wrong, answers are 24 days, 100 minutes and 10 cents, respectively. These wrong answers are attributed to participants becoming over-focused on the numbers so as to ignore the exponential growth pattern in the Lily Pads problem, merely complete a pattern in numbers in the Widgets problem, and neglect the relationship “more than” in the Bat and Ball problem [ 19 ]. The original study by Frederick [ 16 ] provides a composite measure of the performance on these three items, with only 17% of those studied (n = 3428) reaching the perfect score. The CRT has since been studied extensively [ 19 – 21 ]. Research using the CRT tends not to report performance on the individual items of the test, but rather a composite measure of performance. Attridge and Inglis [ 22 ] used the CRT as a test for thinking disposition of mathematics students as one way to attempt to disentangle the issue of filtering according to prior thinking styles rather than transference of knowledge in successful problem solving. They repeat-tested 16-year-old pre-university mathematics students and English literature students without mathematics subjects at a one-year interval and found no difference between groups.
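The three correct answers can each be checked with a line of arithmetic (a sketch of the reasoning, not part of the original study):

```python
# Lily Pads: the patch doubles daily and fills the lake on day 48,
# so it covered half the lake exactly one day earlier.
lily_days = 48 - 1

# Widgets: 5 machines / 5 minutes / 5 widgets means one machine makes
# one widget in 5 minutes, so 100 machines make 100 widgets in 5 minutes.
minutes_needed = 5

# Bat and Ball: ball + (ball + 1.00) = 1.10, so ball = 0.10 / 2.
ball = (1.10 - 1.00) / 2

print(lily_days, minutes_needed, round(ball, 2))  # 47 5 0.05
```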

Three problems were included that test the ability to reason about probability. All three problems were originally discussed by Kahneman and Tversky [ 23 ], with the typically poor performance on these problems explained by participants relying not on probability knowledge, but on a short-cut method of thinking known as the representativeness heuristic. In the late 1980s, Richard Nisbett and colleagues showed that graduate-level training in statistics, while not revealing any improvement in logical reasoning, did correlate with higher-quality statistical answers [ 24 ]. Their studies led in particular to the conclusion that comprehension of what is known as the law of large numbers did show improvement with training. The first of our next three problems targeted this law directly.

Hospitals

A certain town is served by two hospitals. In the larger hospital, about 45 babies are born each day, and in the smaller hospital, about 15 babies are born each day. As you know, about 50 percent of all babies are boys. However, the exact percentage varies from day to day. Sometimes it may be higher than 50 percent, sometimes lower. For a period of one year, each hospital recorded the number of days on which more than 60 percent of the babies born were boys. Which hospital do you think recorded more such days? (Circle one letter.)

  • (a) the larger hospital
  • (b) the smaller hospital
  • (c) about the same (that is, within 5 percent of each other)

Kahneman and Tversky [ 23 ] reported that, of 50 participants, 12 chose (a), 10 chose (b), and 28 chose (c). The correct answer is (b), for the reason that small samples are more likely to exhibit extreme events than large samples from the same population. The larger the sample, the more likely it will exhibit characteristics of the parent population, such as the proportion of boys to girls. However, people tend to discount or be unaware of this feature of sampling statistics, which Kahneman and Tversky refer to as the law of large numbers. Instead, according to Kahneman and Tversky, people tend to adhere to a fallacious law of small numbers, where even small samples are expected to exhibit properties of the parent population, as illustrated by the proportion of participants choosing the answer (c) in their 1972 study. Such thinking reflects use of the representativeness heuristic, whereby someone will judge the likelihood of an uncertain event based on how similar it is to characteristics of the parent population of events.
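The effect of sample size can be made concrete with exact binomial probabilities (my own illustration, not part of the original study): the chance that a given day has more than 60 percent boys is noticeably higher in the smaller hospital.

```python
# Exact probability that more than 60% of one day's births are boys,
# assuming each birth is an independent fair 50/50 event.
from math import comb

def p_more_than_60_boys(n):
    threshold = int(0.6 * n)  # need strictly more than 60% boys
    return sum(comb(n, k) for k in range(threshold + 1, n + 1)) / 2 ** n

p_small = p_more_than_60_boys(15)  # smaller hospital: ~15 births/day
p_large = p_more_than_60_boys(45)  # larger hospital: ~45 births/day
print(p_small > p_large)  # True: the smaller hospital records more such days
```

Over a year, the smaller hospital therefore expects several times as many ">60% boys" days, exactly the behaviour the law of large numbers predicts.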

Birth order

All families of six children in a city were surveyed. In 72 families the exact order of births of boys and girls was GBGBBG.

  • (a) What is your estimate of the number of families surveyed in which the exact order of births was BGBBBB?
  • (b) In the same survey set, which, if any, of the following two sequences would be more likely: BBBGGG or GBBGBG?

All of the events listed in the problem have an equal probability, so the correct answer to (a) is 72, and to (b) is “neither is more likely”. Kahneman and Tversky [ 23 ] reported that 75 of 92 participants judged the sequence in (a) as less likely than the given sequence. A similar number (unspecified by Kahneman and Tversky, but the statistical effect was reported to be of the same order as in (a)) reported that GBBGBG was the more likely sequence. Again, Kahneman and Tversky suggested that these results reflected use of the representativeness heuristic. In the context of this problem, the heuristic would have taken the following form: some birth orders appear less patterned than others, and less patterned is to be associated with the randomness of birth order, making them more likely.
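The key fact, that every specific six-birth sequence is equally likely, is easy to verify by enumeration (illustrative only):

```python
# Enumerate all possible six-child birth orders: each specific sequence,
# patterned-looking or not, occurs exactly once among the 2**6 outcomes.
from itertools import product

sequences = ["".join(s) for s in product("BG", repeat=6)]
p = 1 / len(sequences)

print(len(sequences), p)  # 64 0.015625
```

Both GBGBBG and BGBBBB appear exactly once in this list, so each has the same probability 1/64; "looking random" changes nothing.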

Coin tosses

In a sequence of coin tosses (the coin is fair) which of the following outcomes would be most likely (circle one letter):

  • (a) H T H T H T H T
  • (b) H H H H T T T T
  • (c) T T H H T T H H
  • (d) H T T H T H H T
  • (e) all of the above are equally likely

The correct answer in this problem is (e). Kahneman and Tversky [ 23 ] reported that participants tend to choose less patterned looking sequences (e.g., H T T H T H H T) as more likely than more systematic looking sequences (e.g., H T H T H T H T). This reasoning again reflects the representativeness heuristic.

Three further questions from the literature were included to test problem solving skill.

Two drivers

Two drivers set out on a 100-mile race that is marked off into two 50-mile sections. Driver A travels at exactly 50 miles per hour during the entire race. Driver B travels at exactly 45 mph during the first half of the race (up to the 50-mile marker) and travels at exactly 55 mph during the last half of the race (up to the finish line). Which of the two drivers would win the race? (Circle one letter.)

  • (a) Driver A would win the race
  • (b) Driver B would win the race
  • (c) the two drivers would arrive at the same time (within a few seconds of one another)

This problem was developed by Pelham and Neter [ 25 ]. The correct answer is (a), which can be determined by calculations of driving times for each Driver, using time = distance/velocity. Pelham and Neter argue, however, that (c) is intuitively appealing, on the basis that both drivers appear to have the same overall average speed. Pelham and Neter reported that 67% of their sample gave this incorrect response to the problem, and a further 13% selected (b).
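The calculation Pelham and Neter describe takes two lines (times in hours):

```python
# time = distance / speed for each driver over the 100-mile race
time_a = 100 / 50            # Driver A: 2.0 hours
time_b = 50 / 45 + 50 / 55   # Driver B: ~1.111 + ~0.909 = ~2.020 hours

print(time_a < time_b)  # True: Driver A finishes first
```

Driver B's average speed is slightly below 50 mph because more time is spent in the slow 45 mph half than in the fast 55 mph half.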

Petrol station

Imagine that you are driving along the road and you notice that your car is running low on petrol. You see two petrol stations next to each other, both advertising their petrol prices. Station A’s price is 65c/litre; Station B’s price is 60c/litre. Station A’s sign also announces: “5c/litre discount for cash!” Station B’s sign announces “5c/litre surcharge for credit cards.” All other factors being equal (for example, cleanliness of the stations, number of cars waiting at each etc), to which station would you choose to go, and why?

This problem was adapted from one described by Galotti [ 26 ], and is inspired by research reported by Thaler [ 27 ]. According to Thaler’s research, most people prefer Station A, even though both stations are offering the same deal: 60c/litre for cash, and 65c/litre for credit. Tversky and Kahneman [ 28 ] explain this preference by invoking the concept of framing effects. In the context of this problem, such an effect would involve viewing the outcomes as changes from some initial point. The initial point frames the problem, and provides a context for viewing the outcome. Thus, depending on the starting point, outcomes in this problem can be viewed as either a gain (in Station A, you gain a discount if you use cash) or a loss (in Station B, you are charged more (a loss) for using credit). Given that people are apparently more concerned about a loss than a gain [ 29 ], the loss associated with Station B makes it the less attractive option, and hence the preference for Station A. The correct answer, though, is that the stations are offering the same deal and so no station should be preferred.
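The equivalence of the two stations is a one-line check (prices in cents per litre):

```python
# Effective prices per payment method at each station.
station_a = {"cash": 65 - 5, "card": 65}   # advertised 65c, 5c cash discount
station_b = {"cash": 60, "card": 60 + 5}   # advertised 60c, 5c card surcharge

print(station_a == station_b)  # True: identical deals, different framing
```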

And finally, a question described by Stanovich [ 30 , 31 ] as testing our predisposition for cognitive operations that require the least computational effort.

Jack looking at Anne

Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person? (Circle one letter.)

  • (a) Yes
  • (b) No
  • (c) Cannot be determined

Stanovich reported that over 80% of people choose the “lazy” answer (c). The correct answer is (a).
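The correct answer can be confirmed by exhaustive case analysis: Anne's marital status is unknown, but in both possible cases a married person is looking at an unmarried person. A small sketch:

```python
# Try both possibilities for Anne; Jack is married, George is not.
def married_looking_at_unmarried(anne_married):
    looks = [("Jack", "Anne"), ("Anne", "George")]
    married = {"Jack": True, "Anne": anne_married, "George": False}
    return any(married[a] and not married[b] for a, b in looks)

# The answer is "yes" in every possible world, so it CAN be determined.
print(all(married_looking_at_unmarried(m) for m in (True, False)))  # True
```

If Anne is married, she is looking at unmarried George; if she is not, married Jack is looking at her. Either way the answer is "yes".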

The above questions survey, in a clear problem solving setting, an ability to engage advanced cognitive processing in order to critically evaluate and possibly override initial gut reasoning, an ability to reason about probability within the framework of the law of large numbers and the relationship between randomness and patterning, an ability to isolate salient features of a problem and, with the last question in particular, an ability to map logical relations. It might be hypothesised that, in line with the knowledge base provided and the claims of associated broad and enhanced problem-solving abilities, participants with greater degrees of mathematical training would outperform others on these questions. This hypothesis was investigated in this study. In addition, given that no previous study on this issue has examined the variety of problems used in this study, we also undertook an exploratory analysis to investigate whether there exist any associations between the problems in terms of their likelihood of correct solution. Similarities between problems might indicate which problem solving domains could be susceptible to the effects of mathematics training.

A questionnaire was constructed containing the problems described in the previous sections plus the Four Cards Problem as tested by Inglis and Simpson [ 11 ] for comparison. The order of the problems was as follows: 1) Lily Pads; 2) Hospitals; 3) Widgets; 4) Four Cards; 5) Bat and Ball; 6) Birth Order; 7) Petrol Station; 8) Coin Tosses; 9) Two Drivers; 10) Jack looking at Anne. It was administered to five groups distinctive in mathematics training levels chosen from a high-ranking Australian university, where the teaching year is separated into two teaching semesters and where being a successful university applicant requires having been highly ranked against peers in terms of intellectual achievement:

  • Introductory—First year, second semester, university students with weak high school mathematical results, enrolled in the current unit only as a compulsory component of their chosen degree, a unit not enabling any future mathematical pathway; a typical student may be enrolled in a Biology or Geography major;
  • Standard—First year, second semester, university students with fair to good high school mathematical results, enrolled in the current mathematics unit as a compulsory component of their chosen degree, with the possibility of including some further mathematical units in their degree pathway; a typical student may be enrolled in an IT or Computer Science major;
  • Advanced1—First year, second semester, university mathematics students with a very strong interest in, as well as background in, mathematics, with all higher year mathematical units available as possible future pathways; a typical student may be enrolled in a Mathematics or Physics major;
  • Advanced2—Second year, second semester, university mathematics students with a strong interest in, as well as background in, mathematics, typically a direct follow-on from the previously mentioned Advanced1 cohort;
  • Academic—Research academics in the mathematical sciences.

Participants

123 first year university students volunteered during “help on demand” tutorial times containing up to 30 students. These are course-allocated times that are supervised yet self-directed by students; using them minimised disruption and discouraged coercion. 44 second year university students completed the questionnaire during a weekly one-hour time slot dedicated to putting the latest mathematical concepts into practice with the lecturer (where, by contrast with what occurs in tutorial times, the lecturer does most of the work and all enrolled students are invited). All these university students completed the questionnaire in normal classroom conditions; they were not placed under strict examination conditions. The lead author walked around to prevent discussion and coercion, and there was minimal disruption. 30 research academics responded to local advertising and answered the questionnaire in their workplace while supervised.

The questionnaires were voluntary, anonymous and confidential. Participants were free to withdraw from the study at any time without penalty; no participant took this option, however. The questionnaires gathered demographic information including age, level of education attained, current qualification pursued, name of last qualification and years since obtaining it, and an option for research academics to note their current speciality. Each problem task was placed on a separate page. Participants were not placed under time constraints but, while supervised, were asked to write their start and finish times on the front page of the survey so that approximate completion times could be noted. Speed of completion was not incentivised. Participants were not allowed to use calculators. A final “Comments Page” gave the option for feedback, including specifically whether participants had previously seen any of the questions. Questionnaires were administered in person and supervised to avoid collusion or the consulting of external sources.

The responses were coded four ways: A) correct; B) standard error (the errors discussed above in The Study); C) other error; D) left blank.

The ethical aspects of the study were approved by the Human Research Ethics Committee of the University of Sydney, protocol number [2016/647].

The first analysis examined the total number of correct responses provided by the participants as a function of group. Scores ranged from 1 to 11 out of a possible total of 11 (Problem 6 had two parts) ( Fig 1 ). An ANOVA of these data indicated a significant effect of group (F(4, 192) = 20.426, p < .001, partial η² = .299). Pairwise comparisons using Tukey’s HSD test indicated that the Introductory group performed significantly worse than the Advanced1, Advanced2 and Academic groups. There were no significant differences between the Advanced1, Advanced2 and Academic groups.
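As an illustration of the style of analysis reported here, the following sketch runs a one-way ANOVA followed by Tukey HSD post hoc comparisons in Python. The group sizes, means and score distributions are invented for illustration only and do not reproduce the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical total-correct scores (0-11 scale) for the five groups;
# sample sizes and means are illustrative only, not the study's data.
groups = {
    "Introductory": rng.normal(6.0, 2.0, 60).clip(0, 11),
    "Standard":     rng.normal(7.5, 2.0, 63).clip(0, 11),
    "Advanced1":    rng.normal(9.0, 1.5, 44).clip(0, 11),
    "Advanced2":    rng.normal(9.2, 1.5, 44).clip(0, 11),
    "Academic":     rng.normal(9.4, 1.5, 30).clip(0, 11),
}

# Omnibus test for an effect of group
f_stat, p_value = stats.f_oneway(*groups.values())

# Pairwise post hoc comparisons (Tukey's HSD); tukey.pvalue is a 5x5 matrix
tukey = stats.tukey_hsd(*groups.values())

df_error = sum(len(g) for g in groups.values()) - len(groups)
print(f"F(4, {df_error}) = {f_stat:.2f}, p = {p_value:.4g}")
```

The effect-size measure reported in the text (partial η²) can be recovered from the same sums of squares the ANOVA computes; it is omitted here for brevity.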

Fig 1. (Image: pone.0236153.g001.jpg)

Error bars are one standard error of the mean.

Overall solution time, although recorded manually and only approximately, was positively correlated with group: the more training someone had received, the longer their solution time (r(180) = 0.247, p = .001). However, as can be seen in Fig 2 , this relationship is not strong.
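The reported association between training level and solution time corresponds to a standard Pearson correlation. The sketch below uses simulated data constructed to show a weak positive relationship; the variable names and values are illustrative, not the study's records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated data: training level coded 1-5, completion time in minutes,
# built to have a weak positive association like the one reported.
training_level = rng.integers(1, 6, size=182)
solution_time = 10 + training_level + rng.normal(0, 4, size=182)

r, p = stats.pearsonr(training_level, solution_time)
# Degrees of freedom for a Pearson r are n - 2
print(f"r({len(training_level) - 2}) = {r:.3f}, p = {p:.3f}")
```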

Fig 2. (Image: pone.0236153.g002.jpg)

A series of chi-squared analyses, and their Bayesian equivalents, was performed on each problem to determine whether the distribution of response types differed as a function of group. To minimise the number of cells with expected values less than 5 in some of these analyses, the Standard Error, Other Error and Blank response categories were collapsed into one category (Incorrect Response). For three of the questions, the expected values of some cells still fell below 5, due to most people getting the problem wrong (Four Cards) or most people correctly responding to the problem (Bat and Ball, Coin Tosses). In these cases, the pattern of results was so clear that a statistical analysis was barely required. Significant chi-squared results were examined further with pairwise post hoc comparisons (see Table 1 ).
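The per-problem analysis described above amounts to a chi-squared test of independence on a groups × (correct, incorrect) contingency table after collapsing the error and blank categories. A sketch with made-up counts follows; the Bayes-factor equivalents reported in the text would come from a separate Bayesian contingency-table analysis (e.g., in JASP) and are not computed here.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts for one problem: rows are the five groups,
# columns are (correct, incorrect) after collapsing the standard
# error, other error and blank categories into "incorrect".
table = np.array([
    [10, 50],   # Introductory
    [15, 48],   # Standard
    [20, 24],   # Advanced1
    [17, 27],   # Advanced2
    [11, 19],   # Academic
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")

# The rule of thumb motivating the category collapse: expected
# cell counts should not fall below 5.
print("min expected cell count:", expected.min().round(2))
```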

Superscripts label the groups (e.g., Introductory = a). Within the table, these letters indicate which other group a particular group was significantly different from according to a series of pairwise post hoc chi-squared analyses (Bonferroni corrected α = .005) (e.g., ‘d’ in the Introductory column indicates that the Introductory and Advanced2 (d) groups were significantly different for a particular problem).

The four cards problem

The three groups with the least training in mathematics were far less likely than the other groups to give the correct solution (χ²(4) = 31.06, p < .001; BF₁₀ = 45,045) ( Table 1 ). People in the two most advanced groups (Advanced2 and Academic) were more likely to solve the card problem correctly, although still fewer than half of the people in these groups did so. Further, these people were less likely to give the standard incorrect solution, so that most who were incorrect suggested some more cognitively elaborate answer, such as turning over all cards. The proportions of people in the Advanced2 and Academic groups (39% and 37%) who solved the problem correctly far exceeded the typical proportion observed with this problem (10%). Of note, also, is the relatively high proportion of those in the higher training groups who, when they made an error, did not make the standard error, a result similar to that reported by Inglis and Simpson [ 11 ].

The cognitive reflection test

In the Lily Pads problem, although most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, it was also the case that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ²(4) = 27.28, p < .001; BF₁₀ = 15,554), with the standard incorrect answer being the next most prevalent response for the two lower-ability mathematics groups ( Table 1 ).

Performance on the Widgets problem was similar to that on the Lily Pads problem: most people in the Standard, Advanced1, Advanced2 and Academic groups selected the correct solution, but the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ²(4) = 23.76, p < .001; BF₁₀ = 516) ( Table 1 ). As with the Lily Pads and Widgets problems, people in the Standard, Advanced1, Advanced2 and Academic groups were highly likely to solve the Bat and Ball problem (χ²(4) = 35.37, p < .001; BF₁₀ = 208,667). Errors were more likely from the least mathematically trained people (Introductory, Standard) than from the other groups ( Table 1 ).

To compare performance on the CRT with previously published results, performance on the three problems (Lily Pads, Widgets, Bat and Ball) was combined. The number of people in each condition who solved 0, 1, 2, or 3 problems correctly is presented in Table 2 . The Introductory group was evenly distributed amongst the four categories, with 26% solving all three problems correctly. Around 70% of each of the remaining groups solved all three problems correctly, which is vastly superior to the 17% reported by Frederick [ 16 ].

Responses to the Hospitals problem were almost universally split between correct and standard errors in the Standard, Advanced1, Advanced2 and Academic groups. Although this pattern of responses was also evident in the Introductory group, this group also exhibited more non-standard errors and non-responses than the other groups. However, the differences between the groups were not significant (χ²(4) = 4.93, p = .295; BF₁₀ = .068) ( Table 1 ). Nonetheless, the performance of all groups exceeds the 20% correct response rate reported by Kahneman and Tversky [ 23 ].

The two versions of the Birth Order problem showed similar results, with correct responses being more likely in the groups with more training (i.e., Advanced1, Advanced2 and Academic), and responses being shared amongst the various categories in the Introductory and Standard groups (version (a): χ²(4) = 24.54, p < .001, BF₁₀ = 1,303; version (b): χ²(4) = 25.77, p < .001, BF₁₀ = 2,970) ( Table 1 ). Nonetheless, performance on both versions of the problem in this study was significantly better than the 82% error rate reported by Kahneman and Tversky [ 23 ].

All groups performed well on the Coin Tosses problem, with very few people in any condition committing errors. There were no obvious differences between the groups (χ²(4) = 3.70, p = .448; BF₁₀ = .160) ( Table 1 ). Kahneman and Tversky [ 23 ] reported that people tend to make errors on this type of problem by choosing less patterned-looking sequences, but they did not report the relative proportions of people making errors versus giving correct responses. Clearly the sample in this study did not perform like those in Kahneman and Tversky’s study.

Responses on the Two Drivers problem were clearly distinguished by a high chance of error in the Introductory and Standard groups (over 80%), and a fairly good chance of being correct in the Advanced1, Advanced2 and Academic groups (χ²(4) = 46.16, p < .001; BF₁₀ = 1.32 × 10⁸) ( Table 1 ). Academics were the standout performers on this problem, although over a quarter of this group produced an incorrect response. Thus, the first two groups performed similarly to the participants in the Pelham and Neter [ 25 ] study, 80% of whom gave an incorrect response.

Responses on the Petrol Station problem were marked by good performance by the Academic group (73% providing a correct response), and just over half of each of the other groups correctly solving the problem. This difference was not significant (χ²(4) = 4.68, p = .322; BF₁₀ = .059) ( Table 1 ). Errors were fairly evenly balanced between standard and other, except for the Academic group, who were more likely to provide a creative answer if they made an error. Thaler [ 27 ] reported that most people get this problem wrong. In this study, however, on average, most people got this problem correct, although this average was boosted by the Academic group.

Responses on the Jack looking at Anne problem generally were standard errors, except for the Advanced2 and Academic groups, which were evenly split between standard errors and correct responses (χ²(4) = 18.03, p = .001; BF₁₀ = 46) ( Table 1 ). Thus, apart from these two groups, the error rate in this study was similar to that reported by Stanovich [ 30 ], where 80% of participants were incorrect.

A series of logistic regression analyses was performed to examine whether the likelihood of solving a particular problem correctly could be predicted from performance on the other problems. Each analysis selected performance (correct or error) on one problem as the outcome variable, with performance on the other problems as predictor variables; amount of training was also included as a predictor in each analysis. A further logistic regression was performed with training as the outcome variable and performance on all of the problems as predictor variables. The results of these analyses are summarised in Table 3 . Three multi-variable relationships were observed in these analyses, which can be interpreted as the likelihood of solving one problem in each set being associated with solving the others in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. Training also featured in each of these sets, moderating the relationships as per the results presented above for each problem.
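The logistic regressions can be sketched as follows. For self-containment this fits a binomial logistic regression by plain Newton-Raphson (iteratively reweighted least squares) in NumPy rather than calling a statistics package, and all predictors and coefficients are simulated, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Simulated predictors: intercept, correctness (0/1) on two other
# problems, and training level (1-5). All values are illustrative.
X = np.column_stack([
    np.ones(n),
    rng.integers(0, 2, n),   # solved problem A?
    rng.integers(0, 2, n),   # solved problem B?
    rng.integers(1, 6, n),   # amount of training
])
true_beta = np.array([-3.0, 1.2, 1.0, 0.5])
y = rng.random(n) < 1 / (1 + np.exp(-(X @ true_beta)))

# Newton-Raphson (IRLS) fit of the logistic regression coefficients.
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-(X @ beta)))
    w = p * (1 - p)                       # observation weights
    beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (y - p))

# Percentage of cases correctly classified, as reported in Table 3.
predicted = (1 / (1 + np.exp(-(X @ beta)))) >= 0.5
accuracy = (predicted == y).mean()
print("coefficients:", np.round(beta, 2))
print(f"% correctly classified: {100 * accuracy:.1f}")
```

The multinomial regression used for the training outcome variable generalises this binomial fit to more than two outcome categories.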

P = Problem (1 = Four Cards; 2 = Lily Pads; 3 = Widgets; 4 = Bat & Ball; 5 = Hospitals; 6a = Birth Order (a); 6b = Birth Order (b); 7 = Coin Tosses; 8 = Two Drivers; 9 = Petrol Station; 10 = Jack looking at Anne).

training = Amount of training condition.

p = significance level of logistic regression model.

% = percentage of cases correctly classified by the logistic regression model.

✓ = significant predictor, α < .05.

* = logistic regression for the training outcome variable is multinomial, whereas all other logistic regressions are binomial.

The final “Comments Page” revealed that participants overwhelmingly enjoyed the questions. Any analysis of previous exposure to the tasks proved impossible, as there was little to no alignment in participants’ degree of recall, if any, or even in their perceptions of what exposure entailed. For example, some participants confused being exposed to the particular tasks with being habitually exposed to puzzles, or even to mathematics problems more broadly.

In general, the amount of mathematics training a group had received predicted their performance on the overall set of problems. The greater the training, the more problems were answered correctly, and the slower the recorded response times. There was no obvious difference between the Advanced1, Advanced2 and Academic groups on either of these measures; however, there were clear differences between these groups and the Introductory and Standard groups, with the former exhibiting clearly superior accuracy. Times were recorded only approximately, so as to avoid adding time pressure as a variable, but the fact that the Advanced1, Advanced2 and Academic groups spent more time considering the problems may suggest that a “pause and consider” approach is characteristic of the advanced groups. This is in line with an eye-movement tracking study of mathematically trained students attempting the Four Cards Problem, in which participants who had not chosen the standard error spent longer considering the card linked to the matching bias effect [ 14 ]. It is important to note, however, that longer response times may reflect cognitive processes other than deliberation [ 32 ].

Performance on some problems was associated with performance on others: if someone correctly answered a problem in one of these sets, they were also highly likely to correctly answer the other problems in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. This differs from how these problems have typically been clustered a priori in the research literature: (I) Lily Pads, Widgets and Bat and Ball (CRT); (II) Hospitals and Two Drivers (explained below); (III) Hospitals, Birth Order and Coin Tosses (representativeness heuristic); (IV) Birth Order and Coin Tosses (probability theory). Consideration of these problem groupings follows.

Correctly answering all three problems in (I) entailed not being distracted by particular pieces of information in the problems, so as to stay focused on uncovering the real underlying relationships. The Lily Pads and Widgets problems can mislead if attention is focused too narrowly on the numbers, and conversely, the Petrol Station problem can mislead if there is too much focus on the idea of a discount. While the Lily Pads and Widgets problems are traditionally paired with the Bat and Ball problem in the CRT, it may be that performance on the Bat and Ball problem did not appear as part of this set due to an added level of difficulty. With the problems in (I), avoiding being distracted by certain parts of the questions at the expense of others leads almost directly to the correct answer. However, with the Bat and Ball problem, further steps of mathematical reasoning are still required: one must find two numbers that sum to one given value while differing by another.
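The extra algebraic step the Bat and Ball problem demands can be made explicit using the classic values (a $1.10 total with a $1.00 difference): the pair of equations must be solved, rather than $0.10 read straight off the question.

```python
# Bat and Ball: bat + ball = 1.10 and bat - ball = 1.00.
# Adding the two equations gives 2 * bat = 2.10, so bat = 1.05.
total, difference = 1.10, 1.00
bat = (total + difference) / 2
ball = total - bat
print(f"bat = ${bat:.2f}, ball = ${ball:.2f}")  # not the intuitive $0.10
```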

With the problems in (II), it is of interest that the Two Drivers problem was created specifically to be paired with the Hospitals problem to test for motivation in problem solving [ 23 ]. Within this framework, more transparent versions of these problems were successfully devised to manipulate difficulty. The Two Drivers problem was amended to have Driver B travelling at exactly 5 mph during the first half of the race and at exactly 95 mph during the last half of the race. The Hospitals problem was amended so the smaller hospital would have “only 2” babies born each day, and for a period of one year the hospitals recorded the number of days on which all of the babies born were boys. Could the association in (II) be pointing to how participants overcome initially fictitious mathematical rules? Perhaps they reframe the question in simpler terms to see the pattern. The Four Cards Problem also elicited a high number of incorrect answers where, in association with mathematical training, the standard incorrect solution was avoided in favour of more cognitively elaborate ones. Indeed, a gradation effect appeared across the groups whereby the standard error of the “D and 3” cards becomes “D only” ( Table 4 ). Adrian Simpson and Derrick Watson found a comparable result across their two groups [14 p61]. This could again indicate that, having avoided an initial fictitious rule of simply concentrating on items directly mentioned in the question, participants then seek to reframe the question to unearth the logical rule to be deduced. An added level of difficulty with this question may be why participants become trapped in a false answer. The eye-movement tracking study mentioned above supports this interpretation.

The problems in (III) fit naturally together as part of basic probability theory, a topic participants would have assimilated, or not, as part of various education curricula. While the equal likelihood of all possible outcomes with respect to a coin toss may be culturally assimilated, the same may not be as straightforward for birth gender outcomes where such assumptions could be swayed by biological hypothesis or folk wisdom [ 33 ]. The gradation of the results in terms of mathematical training does not support this possibility.

The effect of training on performance accuracy was more obvious in some problems compared to others, and to some extent, this was related to the type of problem. For instance, most of the problems in which performance was related to training (Four Cards, CRT [Lily Pads, Widgets, Bat and Ball], Two Drivers, Jack looking at Anne) could be classed as relying on logical and/or critical thinking. The one exception was the Birth Order problems, which are probability related.

In contrast, two of the three problems in which training did not appear to have much impact on performance (Hospitals and Coin Tosses) require domain-specific knowledge. The Hospitals problem requires a degree of knowledge about sampling statistics, a topic of quite distinct flavour with which not all mathematically trained individuals gain familiarity. On the other hand, the good performance of all groups on the Coin Tosses problem is in line with a level of familiarity with basic probability as originally presented at high school. While the questioning of patterning as negatively correlated with randomness is similar to that in the Birth Order question, in the Birth Order question this aspect is arguably more concealed. These results, together with problem grouping (III), could point to an area for improvement in teaching: the small gap in knowledge required to go from answering the Coin Tosses problem correctly to achieving similarly on the Birth Order problem could be easily addressed. A more formal introduction to sampling statistics in mathematical training could potentially bridge this gap, and could further be extended towards improving performance on the Hospitals problem.

The other problem where performance was unrelated to training, the Petrol Station problem, cannot be characterised similarly. It is more of a logical/critical thinking type problem, where there remains some suggestion that training may have impacted performance, as the Academic group seemed to perform better than the rest of the sample. An alternate interpretation of this result is therefore that this problem should not be isolated but grouped with the other problems where performance is affected by training.

Although several aspects of the data suggest mathematics training improves the chances that someone will solve problems of the sort examined here, differences in the performance of participants in the Advanced1, Advanced2 and Academic groups were not obvious. This is despite the fact that large differences exist in the amount of training across these three groups: the first two groups were undergraduate students, while the Academic group all had PhDs and many were experienced academic staff. One interpretation of this result is that current mathematics training can only take someone so far in terms of improving their abilities with these problems. There is a point of demarcation in mathematical knowledge to consider between the Advanced1, Advanced2 and Academic groups on the one hand and the Introductory and Standard groups on the other. In Australia, students are able to drop mathematical study at ages 15–16 years, or to choose between a number of increasingly involved levels of mathematics. For the university in this study, students are filtered upon entry into mathematics courses according to their current knowledge status. All our groups involved students who had opted for post-compulsory mathematics at high school. And since our testing occurred in second semester, some of the mathematical knowledge shortfalls present upon arrival had been bridged in first semester; students must pass a first semester course to be allowed entry into the second semester course. A breakdown of the mathematics background of each group is as follows:

  • The Introductory group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Introduction to Differentiation, Applications of the Derivative, Antiderivatives, Areas and the Definite Integral), Financial Mathematics, Statistical Analysis. The Introductory group then explored concepts in mathematical modelling with emphasis on the importance of calculus in their first semester of mathematical studies.
  • The Standard group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Rates of Change, Integration including the method of substitution, trigonometric identities and inverse trigonometric functions, Areas and Volumes of solids of revolution, some differential equations), Combinatorics, Proof (with particular focus on Proof by Mathematical Induction), Vectors (with application to projectile motion), Statistical Analysis. In first semester their mathematical studies then covered a number of topics the Advanced1 group studied prior to gaining entrance at university; further details on this are given below.
  • The Advanced1 group’s mathematics high school syllabus studied prior to first semester course entry covered: the same course content the Standard group covered at high school plus extra topics on Proof (develop rigorous mathematical arguments and proofs, specifically in the context of number and algebra and further develop Proof by Mathematical Induction), Vectors (3 dimensional vectors, vector equations of lines), Complex Numbers, Calculus (Further Integration techniques with partial fractions and integration by parts), Mechanics (Application of Calculus to Mechanics with simple harmonic motion, modelling motion without and with resistance, projectiles and resisted motion). The Standard group cover these topics in their first semester university studies in mathematics with the exclusion of further concepts of Proof or Mechanics. In first semester the Advanced1 group have built on their knowledge with an emphasis on both theoretical and foundational aspects, as well as developing the skill of applying mathematical theory to solve practical problems. Theoretical topics include a host of theorems relevant to the study of Calculus.

In summary, at the point of our study, the Advanced1 group had more knowledge of, and practice with, rigorous mathematical arguments and proofs in the context of number and algebra, and more in-depth experience with Proof by Induction, but the bulk of their extra knowledge rests with a much deeper knowledge of Calculus. They have had longer experience with a variety of integration techniques, and have worked with a variety of applications of calculus to solve practical problems, including a large section on mechanics at high school. In first semester at university there has been a greater focus on theoretical topics, including a host of theorems and associated proofs relevant to the topics studied. Compared to the Introductory and Standard groups, the Advanced1 group have only widened the mathematics knowledge gap since their choice of post-compulsory mathematics at high school. The Advanced2 group come directly from an Advanced1 cohort, and the Academic group would have reached the Advanced1 group’s proficiency as part of their employment. So, do specific reasoning skills result from this level of abstract reasoning? Our findings suggest this should certainly be an area of investigation, and they link interestingly with other research. In studying one of the thinking tasks in particular (the Four Cards Problem), and its context of conditional inference more specifically, Inglis and Simpson [ 15 ] found a clear difference between undergraduates in mathematics and undergraduates in other university disciplines, yet also showed a lack of development over first-year university studies on conditional inference measures. A follow-up study by Attridge and Inglis [ 22 ] then zeroed in on post-compulsory high school mathematical training and found that students with such training did develop their conditional reasoning to a greater extent than a control group over the course of a year, despite having received no explicit tuition in conditional logic.
The development, though, while demonstrated as not being the result of a domain-general change in cognitive capacity or thinking disposition, and as most likely associated with the domain-specific study of mathematics, revealed a complex pattern of endorsing more of some inferences and fewer of others. The present study focused on a much broader problem set associated with logical and critical thinking, and it too suggests a more complex picture of how mathematics training may contribute to problem-solving styles. Any account of the impact of mathematical training on problem-solving techniques will evidently need to accommodate this more intricate pattern.

There is also a final interpretation to consider: that people in the Advanced1, Advanced2 and Academic groups did not gain anything from their mathematics training in terms of their ability to solve these problems. Instead, given that studies have found no correlation between many of these problems and what is currently measured as intelligence [ 30 ], they might simply have been people of a particular intelligence or thinking disposition to start with, who were able to use that intelligence not only to solve these problems but also to survive the challenges of their mathematics training.

That the CRT has traditionally been used as a measure of baseline thinking disposition, and that performance on it has been found to be immutable across tested groups, is of particular interest, since our results show a clear possible training effect on these questions. The CRT is tied to a willingness to engage in effortful thinking, which presents as an ability suitable for training. It is beyond the scope of this study, but a thorough review of CRT testing might suggest a broader appreciation of, and a better framework for understanding, thinking disposition, ability and potential ability.

Mathematical training appears associated with certain thinking skills, but there are clearly some subtleties that need to be teased apart. The thinking tasks here add to the foundational results, with the aim of providing a firmer platform on which to eventually base more targeted and illustrative inquiry. If thinking skills can be fostered, could first year university mathematics teaching be improved so that students from all groups reach the Advanced1 group’s level of reasoning? Do university mathematics courses become purely about domain-specific knowledge from this point on? Intensive training has been shown to impact the brain and cognition across a number of domains, from music [ 34 ], to video gaming [ 35 ], to Braille reading [ 36 ]. The hypothesis that mathematics, with its highly specific practice, fits within this list remains legitimate, but simply uncharted. With our current level of understanding it is worth appreciating the careful wording of the NYU Courant Institute on ‘Why Study Math?’, which makes no assumption of causation: “Mathematicians need to have good reasoning ability in order to identify, analyze, and apply basic logical principles to technical problems.” [ 37 ].

Limitations

One possible limitation of the current study is that the problems may have been too easy for the more advanced people, and so we observed a ceiling effect (i.e., some people obtained 100% correct on all problems). This was most obvious in the Advanced1, Advanced2 and Academic groups. It is possible that participants in these groups had developed logical and critical thinking skills throughout their mathematical training that were sufficient to cope with most of the problems used in this study, and so this would support the contention that training in mathematics leads to the development of logical and critical thinking skills useful in a range of domains. Another interpretation is that participants in these groups already possessed the necessary thinking skills for solving the problems in this study, which is why they are able to cope with the material in the advanced units they were enrolled in, or complete a PhD in mathematics and hold down an academic position in a mathematics department. This would then suggest that training in mathematics had no effect on abstract thinking skills—people in this study possessed them to varying extents prior to their studies. This issue might be settled in a future study that used a greater number of problems of varying difficulties to maximise the chances of finding a difference between the three groups with the most amount of training. Alternatively, a longitudinal study that followed people through their mathematics training could determine whether their logical and critical thinking abilities changed throughout their course.

A further limitation of the study may be that several of the reasoning biases examined in this study were measured by only one problem each (i.e., Four Cards Problem, Two Drivers, Petrol Station, Jack looking at Anne). A more reliable measure of these biases could be achieved by including more problems that tap into these biases. This would, however, increase the time required of participants during data collection, and in the context of this study, would mean a different mode of testing would likely be required.

Broad sweeping intuitive claims of the transferable skills endowed by a study of mathematics require evidence. Our study uniquely covers a wide range of participants, from limited mathematics training through to research academics in the mathematical sciences. It furthermore considered performance on 11 well-studied thinking tasks that typically elude participants in psychological studies and on which results have been uncorrelated with general intelligence, education levels and other demographic information [ 15 , 16 , 30 ]. We identified different performances on these tasks with respect to different groups, based on level of mathematical training. This included the CRT which has developed into a method of measuring baseline thinking disposition. We identified different distributions of types of errors for the mathematically trained. We furthermore identified a performance threshold that exists in first year university for those with high level mathematics training. This study then provides insight into possible changes and adjustments to mathematics courses in order for them to fulfil their advertised goal of reaching improved rational and logical reasoning for a higher number of students.

It is central to any education program to have a clear grasp of the nature of what it delivers and how, but arguably especially so for the core discipline that is mathematics. In 2014 the Office of the Chief Scientist of Australia released the report “Australia’s STEM workforce: a survey of employers”, in which the transferable skills attributed to mathematics were among those employers deemed most valuable [38]. A better understanding of what mathematics delivers in this space is an opportunity to truly capitalise on this historical, culture-crossing subject.

Supporting information

Acknowledgments

The authors would like to thank Jacqui Ramagge for her proof reading and input, as well as support towards data collection.

Funding Statement

The authors received no specific funding for this work.

Data Availability

PLoS One. 2020; 15(7): e0236153.

Decision Letter 0

17 Mar 2020

PONE-D-20-01159

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Dear Professor Speelman,

Thank you for submitting your manuscript to PLOS ONE. I have sent it to two expert reviewers and have received their comments back. As you can see at the bottom of this email, both reviewers are positive about your manuscript but raise some issues that you would need to address before the manuscript can be considered for publication. Notably, reviewer #1 points out that the manuscript should include a discussion on the reasons why individuals with math training may have improved reasoning skills (e.g., logical intuitions versus deliberate thinking). The reviewer also rightly mentions that your sample sizes are limited, notably for the most advanced groups. This should be discussed and acknowledged. Reviewer #2 has a number of conceptual and methodological points that you will also have to address. The reviewer provides very thorough comments and I will not reiterate the points here. However, note that both reviewers suggest that you need to improve the figures and I agree with them.   

We would appreciate receiving your revised manuscript by May 01 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as a separate file and labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as a separate file and labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as a separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Jérôme Prado

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements:

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.plosone.org/attachments/PLOSOne_formatting_sample_main_body.pdf and http://www.plosone.org/attachments/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include additional information regarding the survey or questionnaire used in the study and ensure that you have provided sufficient details that others could replicate the analyses. For instance, if you developed a questionnaire as part of this study and it is not under a copyright more restrictive than CC-BY, please include a copy, in both the original language and English, as Supporting Information. Please also let us know if it would be possible to provide the anonymized data points necessary to replicate the statistical analyses, for instance, as shown in fig 1 and 2. If so, please deposit those to a suitable data repository or include them in the Supporting Information files.

3. Thank you for stating the following financial disclosure:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

  • Please provide an amended Funding Statement that declares *all* the funding or sources of support received during this specific study (whether external or internal to your organization) as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now .  
  • Please state what role the funders took in the study.  If any authors received a salary from any of your funders, please state which authors and which funder. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer #1: I think this is a very good and interesting manuscript trying to answer an important research question. I propose some changes that I believe should be applied before publication.

1. Each reasoning bias is measured with only one problem. In reasoning research, it is rather common to measure each type of reasoning problem with a series of structurally equivalent reasoning problems, so the results will be independent of contexts effects and will be generalizable to that type of problem. Here, the authors only measured each reasoning bias with one single problem and this might be problematic (see, for example: Fiedler & Hertel, 1994). I think this can be addressed by simply discussing it in the limitation section.

2. This is rather a minor issue, but the discussion on the CRT problems is not up-to-date (page 7). Most recent experiments on dual process theory suggest that people who are able to correctly solve these reasoning problems (including the CRT) do so intuitively, and not because they engaged in careful deliberation (Bago & De Neys, 2019). Intelligence made people have better intuitive responses (Thompson, Pennycook, Trippas & Evans, 2018). Similarly, this problems persists in the discussion about reaction times (page 25). Longer reaction times does not necessarily mean that people engaged in deliberation (see: Evans, Kyle, Dillon & Rand, 2015). Response time might be driven by decision conflict or response rationalization. These issues could be clarified with some changes in the wording or some footnotes on page 7 and 25. Furthermore, it would be interesting to have a discussion on how mathematical education helps people overcome their biases. Is it because it creates better intuition, or helps people engage in deliberation? An interesting question this manuscript does not discuss. It’s on the authors whether or not they discuss this latter point now, but the changes on page 7 and 25 should be made.

3. A more serious problem is the rather small sample size (especially in the more advanced groups). This small sample size makes the appearance of both false negatives and false positives more likely. Perhaps, the authors could compute the Bayes Factors for the chi-square or logistic regression test, so we can actually see how strong the evidence is for or against the null. This is especially important as the authors run a great number of explorative analysis (Table 3), and some of those results might need to be interpreted with great caution (depending on the Bayes Factor).

The graphs are not looking good, they should comply with APA formatting. At the very least, the axis titles should be meaningful and measure units should be written there.

The presentation order of the problems is quite unusual; why isn’t it random? Why did the authors decide on this order?

Reviewer #2: The study reported in this paper compared five groups of participants with varying levels of mathematical expertise on a set of reasoning tasks. The study is interesting and informative. It extends the current literature on this topic (which is reviewed very nicely in the introduction). However, there are some issues with the current analysis and interpretation that should be resolved prior to publication. I have therefore recommended major revisions. My comments are organised in the order in which they came up in the paper and they explain my responses to the questions above.

1. Line 114 – “general population” a bit misleading – they were also students but from other disciplines.

2. Line 124 onwards reads:

“The ultimate question to consider here is: are any skills associated with mathematics training innate or do they arise from skills transfer? Though to investigate how mathematical training affects reasoning skills, randomised sampling and randomised intervention to reveal causal relationships are clearly not viable. With so many possible confounding variables and logistical issues, it is even questionable what conclusions such studies might provide. Furthermore, a firm baseline from which to propose more substantive investigations is still missing.”

I find this paragraph slightly problematic because the current study doesn’t inform us on this ultimate question, so it makes the outline of the current study in the following paragraph feel unsatisfactory. I think the current study is important but prefacing it with this paragraph underplays that importance. And I think a randomised controlled study, although not viable, would give the answers we need because the random allocation to groups would allow us to rule out any confounding variables. Finally, the last sentence in this paragraph is unclear to me.

3. In the descriptions of the five participants groups the authors refer to the group’s level of interest in mathematics, but this seems like an overgeneralisation to me. Surely the introductory group could contain a biology student who also happens to be good at mathematics and very much enjoy it? I would be more comfortable with the descriptions if the parts about interest level were removed.

4. How many of the 123 first year students were in each of the three first year groups?

5. Line 313 – the standard group is referred to as “university mathematics students”, but they are not taking mathematics degrees.

6. Line 331 - what is a practice class?

7. Were the data collection settings quiet? From the description it sounds like groups of participants were completing the study at the same time in the same room, but the authors should make this explicit for the sake of the method being reproducible. E.g. how many students were in the room at the time?

8. Line 355-356 – the authors should not use the term “marginally worse” because this is statistically inappropriate – in a frequentist approach results are either significant or non-significant.

9. Line 340 – “approximate completion times were noted.”

This doesn’t sound rigorous enough to justify analysing them. Their analysis is interesting, but the authors should remind readers clearly whenever the response times are analysed or discussed that their recording was only manual and approximate.

10. I suggest replacing Figure 1 with a bar chart showing standard error of the mean on the error bars. A table with mean score out of 11 and the standard deviation for each group may also be useful. Figure 2 should be a scatterplot rather than a box and whisker plot.
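
As an illustration of the reviewer's suggestion, the standard error of the mean needed for such error bars can be computed directly. This is a sketch with hypothetical scores out of 11, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical per-participant scores out of 11 for two groups
# (illustrative values only, not taken from the manuscript).
scores = {
    "Introductory": [4, 5, 3, 6, 5, 4],
    "Advanced1":    [9, 10, 8, 11, 9, 10],
}

for group, xs in scores.items():
    m, sd = mean(xs), stdev(xs)          # sample mean and sample SD
    sem = sd / sqrt(len(xs))             # standard error of the mean, for the error bars
    print(f"{group}: M = {m:.2f}, SD = {sd:.2f}, SEM = {sem:.2f}")
```

The same M and SD values would populate the summary table the reviewer requests.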

11. Was the 0-11 total correct score approximately normally distributed across the full sample?

12. Chi square analysis requires at least 5 cases in each cell, was this met? It seems not since Table 1 shows lots of cells in the “no response” row having 0% of cases.
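
The assumption the reviewer refers to (expected counts of at least 5 per cell) can be checked by computing the expected counts from the marginal totals. The counts below are hypothetical, for illustration only:

```python
# Expected counts for a contingency table: E[i][j] = row_total * col_total / grand_total.
# Hypothetical response counts per group (illustrative values only, not the study's data).
table = [
    [30, 25, 10, 8, 5],   # e.g. "correct" responses per group
    [10, 15, 12, 9, 7],   # "standard error"
    [2,  1,  0,  0,  0],  # "no response" -- a sparsely populated row
]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

expected = [[r * c / grand for c in col_totals] for r in row_totals]
low_cells = [(i, j) for i, row in enumerate(expected)
             for j, e in enumerate(row) if e < 5]
print(low_cells)  # cells where the chi-square approximation is unreliable
```

With sparse rows such as "no response", several expected counts fall below 5, which is exactly the situation the reviewer flags.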

13. The chi-square analyses should be followed up with post hoc tests to see exactly where the differences between groups are. The descriptions as they stand aren’t that informative (as readers can just look at Table 1) without being backed up by post hoc tests.

14. For each chi square analysis in the text, I would find it easier to read if the test statistics came at the top of the paragraph, before the description.

15. Line 381-383 – “Of note, also, is the relatively low proportion of those in the higher training groups who, when they made an error, did not make the standard error, a similar result to the one reported by Inglis and Simpson [11]."

I think this is supposed to say that a low proportion did make the standard error or that a high proportion did not make the standard error.

16. Line 403 - p values this small should be reported as p < .001 rather than p = .000 since they aren’t actually 0.
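
This reporting convention can be captured in a small helper function (an illustrative sketch, not part of the manuscript's analysis pipeline):

```python
def format_p(p: float) -> str:
    """APA-style p-value string: 'p < .001' for very small values, else 'p = .xxx'."""
    if p < 0.001:
        return "p < .001"
    # Three decimals, with the leading zero dropped (APA convention for p values).
    return "p = " + f"{p:.3f}".lstrip("0")

print(format_p(0.00004))  # p < .001
print(format_p(0.0312))   # p = .031
```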

17. Line 476 – “…if a particular outcome variable was predicted significantly by a particular predictor variable, the converse relationship was also observed”

Isn’t that necessarily the case with regression analyses, like with correlations?

18. I don’t think the logistic regression analyses add much to the paper and at the moment they come across as potential p-hacking since they don’t clearly relate to the research question. To me they make the paper feel less focused. Having said that, there is some interesting discussion of them in the Discussion section. I’d recommend adding some justification to the introduction for why it is interesting to look at the relationships among tasks (without pretending to have made any specific hypotheses about the relationships, of course).

19. Line 509 would be clearer if it read “between these groups and the introductory and standard groups”

20. Lines 597 – 620 - This is an interesting discussion, especially the suggestion that advanced calculus may be responsible for the development. No development in reasoning skills from the beginning of a mathematics degree onwards was also found by Inglis and Simpson (2009), who suggested that the initial difference between mathematics and non-mathematics undergraduates could have been due to pre-university study of mathematics. Attridge & Inglis (2013) found evidence that this was the case (they found no difference between mathematics and non-mathematics students at age 16 but a significant difference at the end of the academic year, where the mathematics students had improved and the non-mathematics students had not).

Could the authors add some discussion of whether something similar may have been the case with their Australian sample? E.g. do students in Australia choose whether, or to what extent, to study mathematics towards the end of high school? If not, the description of the groups suggests that there were at least differences in high school mathematics attainment between groups 1-3, even if they studied the same mathematics curriculum. Do the authors think that this difference in attainment could have led to the differences between groups in the current study?

21. Line 617 – “Intensive training has been shown to impact the brain and cognition across a number of domains from music, to video gaming, to Braille reading [31].”

Reference 31 appears to only relate to music. Please add references for video gaming and Braille reading.

22. I recommend editing the figures from SPSS’s default style or re-making them in Excel or DataGraph to look more attractive.

23. I cannot find the associated datafile anywhere in the submission. Apologies if this is my mistake.

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

20 Apr 2020

All responses are detailed against the specific reviewers' comments in the Response to Reviewers document

Submitted filename: Response to Reviewers.docx

Decision Letter 1

11 Jun 2020

PONE-D-20-01159R1

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors.

Dear Dr. Speelman,

Thank you for submitting your revised manuscript to PLOS ONE. I have sent it to reviewer #2 and have now received the reviewer's comments. As you can see, the reviewer thinks that the manuscript is improved but has some outstanding issues that you would need to address in another round of revision. I notably agree with the reviewer that you should provide the raw data, allowing readers to replicate your analyses. Therefore, I invite you to submit a revised version of your manuscript.

Please submit your revised manuscript by Jul 26 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Reviewer #2: The manuscript has improved but there are still a few issues that should be resolved prior to publication.

1. On lines 96, 97, 100 and 102, the references to “general population” should be changed to reflect the fact that these participants were non-mathematics (arts) students.

2. Line 306 – change “mathematics students” to “university students”.

3. The method section doesn’t specify the gender split and mean age of the sample.

4. Table 3 – the p values listed as .000 should be changed to <.001.

5. Table 3 - I suggest repeating the list of problem numbers and names in the legend. It may make for a long legend but would make it much easier for the reader to interpret the table.

6. I am not sure what the new post hoc tests are comparing. What I expected was to see group 1 compared to groups 2, 3, 4 and 5, and so on. This would tell us which groups are statistically different from each other. At the moment we only know from the overall chi square tests whether there are any differences among the groups or not, we don’t know specifically which groups are statistically different from each other and which ones are not. We only have the authors’ interpretations based on the observed counts.
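
The pairwise comparisons the reviewer asks for could, for example, take the form of 2x2 chi-square tests (correct vs. incorrect counts for each pair of groups) with a Bonferroni-adjusted alpha. This is a sketch with hypothetical counts, not the study's data:

```python
from itertools import combinations

# Hypothetical (correct, incorrect) counts per group -- illustrative values only.
counts = {
    "Introductory": (20, 40),
    "Standard":     (25, 35),
    "Advanced1":    (45, 15),
}

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

pairs = list(combinations(counts, 2))
alpha = 0.05 / len(pairs)  # Bonferroni adjustment across all pairwise tests
for g1, g2 in pairs:
    stat = chi2_2x2(*counts[g1], *counts[g2])
    # Compare stat against the chi-square critical value for df = 1 at the
    # adjusted alpha (3.841 at alpha = .05 before adjustment).
    print(f"{g1} vs {g2}: chi2(1) = {stat:.2f}")
```

Each pairwise statistic would then show which specific group differences drive the significant omnibus test.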

7. Line 584 - change “performance was correlated with training” to “performance was related to training” to avoid any confusion since a correlation analysis was not performed.

8. Data file – I had expected the data file to give the raw data rather than summary data, i.e. with each participant in a separate row, and a column indicating their group membership, a column giving their age, a column for sex etc (including all the demographics mentioned in the method), and a column for each reasoning question. This would allow other researchers to replicate the regression analyses and look at other relationships within the dataset. Without being able to replicate all analyses in the paper, the data file does not meet the minimal data set definition for publication in PLOS journals: https://journals.plos.org/plosone/s/data-availability .

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 1

16 Jun 2020

Please see "Response to Reviewers" document

Decision Letter 2

PONE-D-20-01159R2

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org .

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org .

Additional Editor Comments (optional):

Acceptance letter

Dear Dr. Speelman:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org .

If we can help with anything else, please email us at plosone@plos.org .

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Jérôme Prado


Logical Reasoning involves the ability to use and understand logical connections between facts or ideas.

  • In verbal reasoning, questions are expressed in words or statements and require the reader to think critically about the language used in order to choose the correct answer from the given options.
  • Non-verbal reasoning, meanwhile, involves questions presented as images and figures, requiring the reader to work out how one element relates to another before selecting the right answer from a list of potential answers.

Logical Reasoning is a key component of many competitive and reasoning ability-testing exams in India and abroad. Reasoning questions allow organizations to assess a candidate’s problem-solving skills, critical thinking capabilities, and capacity for logical and analytical thinking. 

Aptitude Questions such as Quantitative Aptitude and Logical Reasoning are considered essential skills for success in a wide range of competitive exams worldwide. These two sections often form the backbone of entrance exams, whether it’s for a public sector job in India or a university admission test in the United States.

Logical Reasoning

Go through the following article to learn more about the various types of reasoning questions generally included in competitive tests.

Logical Reasoning Topics

Logical Reasoning is a crucial section in various competitive exams, and aspirants must study these topics to improve their problem-solving abilities and score better.

Types of Questions included in logical reasoning:

  • Verbal Questions
  • Puzzle Questions
  • Image-Based Questions
  • Sequence Questions

Topic-wise practice questions on logical reasoning:

  • Number Series
  • Letter and Symbol Series
  • Verbal Classification
  • Essential Part
  • Artificial Language
  • Matching Definitions
  • Making Judgments
  • Logical Problems
  • Logical Games
  • Analyzing Arguments
  • Course of Action
  • Statement and Conclusion
  • Theme Detection
  • Cause and Effect
  • Statement and Argument
  • Logical Deduction
  • Letter Series
  • Verification of the Truth of the Statement
  • Coding Decoding
  • Assertion and Reason
  • Statement and Assumptions
  • Logical Venn Diagram

Verbal Reasoning

Verbal reasoning is the cognitive ability to understand and interpret information presented in written or spoken language and apply logical reasoning to draw conclusions and solve problems.

It involves analyzing and evaluating information, making inferences and deductions, and identifying relationships between concepts and ideas. Verbal reasoning often tests a candidate’s language comprehension, critical thinking, and analytical skills and is commonly used in aptitude tests, job interviews, and higher education admissions.

A strong grasp of verbal reasoning can help individuals communicate effectively, think critically, and make informed decisions in their personal and professional lives.

Verbal Reasoning Questions and Answers Topics

  • Logical Sequence of Words
  • Blood Relation Test
  • Series Completion
  • Cube and Cuboid
  • Seating Arrangement
  • Character Puzzles
  • Direction Sense Test
  • Classification
  • Data Sufficiency
  • Arithmetic Reasoning
  • Verification of Truth

Non-Verbal Reasoning

Non-verbal reasoning is the cognitive ability that involves questions presented as images and figures, requiring the reader to comprehend how one element relates to another before selecting the right answer out of a list of potential answers.

Non-verbal reasoning often tests a candidate’s ability to think creatively, solve problems, and make quick decisions, and is commonly used in aptitude tests, job interviews, and higher education admissions.

A strong grasp of non-verbal reasoning can help individuals develop their creativity, spatial awareness, and problem-solving abilities, making them more effective at tackling complex challenges in their personal and professional lives.

If you are a government exam aspirant or a student preparing for college placements, reasoning is a topic you need to practice thoroughly. Below are some topics that should be practiced well for the reasoning section of the exam.

Non-Verbal Reasoning Questions and Answers Topics

  • Analytical Reasoning
  • Mirror Images
  • Water Images
  • Embedded Images
  • Pattern Completion
  • Figure Matrix
  • Paper Folding
  • Paper Cutting
  • Rule Detection
  • Grouping of Images
  • Dot Situation
  • Shape Construction
  • Image Analysis
  • Cubes and Dice
  • Picture Analogies

Logical reasoning is an important assessment tool for a wide range of competitive examinations. Questions in this section are designed to judge a candidate’s analytical and logical thinking abilities. Various types of reasoning questions are included in this section to test the student’s capacity for problem-solving, deduction, and inference.

Practicing questions is the only way to prepare for the reasoning test section. This way, even those who may struggle in this section can have an equal chance at success during exams or applications. The article contains concepts, questions, and topics of the reasoning section from the competitive exams and the placement exams’ point of view. 

FAQs – Logical Reasoning

Q1. What is logical reasoning?

Logical reasoning involves the ability to use and understand logical connections between facts or ideas. Reasoning is a critical component of many tests and interviews, so it can be beneficial to practice reasoning questions with solutions available.

Q2. What are logical reasoning questions? 

Logical reasoning questions can be both verbal and non-verbal. Verbal questions are expressed in words or statements and require the reader to think critically about the language used in order to choose the correct answer from the given options. Non-verbal questions are presented as images and figures, requiring the reader to work out how one element relates to another before selecting the right answer from a list of potential answers.

Q3. What is the approach to solving reasoning questions? 

Follow the steps given below for preparation:

1. Practice with a timer and solve questions within the time limit.
2. Read each question carefully and try to understand the logic behind it.
3. Practice as many questions as you can and brush up on your skills.

Q4. Which book is good for the preparation of reasoning question sets? 

Students can practice from the following books:

1. A Modern Approach to Verbal & Non-Verbal Reasoning by R.S. Agarwal
2. Shortcuts in Reasoning (Verbal, Non-Verbal, Analytical & Critical) for Competitive Exams by Disha Experts
3. How to Crack Test of Reasoning by Arihant Experts

Q5. What is the syllabus of the Reasoning Aptitude section for competitive exams? 

Reasoning Aptitude covers a wide range of topics. Those topics are already given in the article. Aspirants must go through the article to learn about those topics and practice them thoroughly.


How to assess reasoning skills


Identifying individuals with excellent reasoning skills is a common goal when making new hires. The ability of your employees to analyze information, think critically, and draw logical conclusions is crucial in today’s dynamic professional landscape. 

Pre-employment assessments offer great value by effectively assessing these essential capabilities. 

TestGorilla’s assessments objectively gauge a candidate’s ability to solve problems, evaluate arguments, and draw logical inferences. By leveraging these assessments, you can secure candidates with the cognitive skills necessary for analytical thinking and decision-making.

Table of contents

  • What is a reasoning skills assessment?
  • Why are reasoning skills important?
  • What skills and traits do employees with good reasoning have?
  • Tests for evaluating reasoning skills
  • How TestGorilla can help you find candidates with reasoning skills

A reasoning skills assessment is a valuable tool that can provide insights into a candidate’s ability to analyze information, think critically, and make logical deductions. 

This assessment aims to evaluate an individual’s cognitive skills related to problem-solving, decision-making, and analytical thinking.

There are several types of cognitive ability tests that can aid in assessing reasoning. During a reasoning skills assessment, candidates are presented with various scenarios, questions, or problems that require them to apply logical thinking and problem-solving techniques. 

It can involve evaluating arguments, identifying patterns, making inferences, or solving puzzles. 

Assessments often use standardized tests or exercises that measure different aspects of reasoning. They’re designed to objectively evaluate a candidate’s cognitive abilities rather than simply relying on qualifications or experience. 

Using a reasoning skills assessment, you can make more informed decisions about a candidate’s aptitude for sound reasoning, problem-solving, and decision-making.

Why are reasoning skills important?

Effective problem-solving

Employees with solid reasoning skills can tackle complex problems with clarity and efficiency. They can analyze information, identify patterns, and make logical connections, enabling them to devise smart ways to meet challenges. 

Their problem-solving ability enhances productivity, streamlines work processes, and drives continuous organizational improvement. This is why you need analytic skills testing in your hiring process if you want to find the best candidates.

Quality decision-making

Reasoning skills contribute to effective decision-making. Employees who think critically and can logically evaluate information are more likely to make informed decisions based on evidence and careful analysis. 

Their ability to weigh options, consider potential outcomes, and anticipate risks helps mitigate errors.

Adaptability and flexibility

Individuals who can think critically and analyze situations from different angles are better equipped to embrace new challenges, adjust their approach, and find new strategies. 

Their adaptability fosters resilience, enabling them to thrive in fast-paced industries and contribute to organizational growth and success.

Enhanced innovation

Reasoning skills are at the core of innovative thinking. Employees who excel in reasoning can identify gaps, find opportunities, and connect seemingly unrelated ideas or concepts. 

Their ability to analyze data, draw logical conclusions, and come up with creative new tactics drives innovation. Hiring individuals with superb reasoning skills encourages the development of groundbreaking new ideas.

Effective risk management

Employees with exemplary reasoning abilities can evaluate potential risks, weigh their impact, and consider mitigation strategies. 

Their ability to anticipate challenges and make calculated decisions reduces the likelihood of costly errors or setbacks, contributing to effective risk management within your organization.

Continued learning and growth

People with great reasoning skills tend to be lifelong learners. They have a natural curiosity and a desire to expand their knowledge and skills. 

Their ability to think critically and adapt enables them to embrace new information, learn from experiences, and grow professionally. 

Effective communication and collaboration

Employees with reasoning skills can think critically and express their ideas clearly. They can engage in meaningful discussions, contribute valuable insights, and articulate their viewpoints. 

They can also understand and respect diverse perspectives, leading to enhanced teamwork, collaboration, and the generation of new, exciting courses of action through collective intelligence.

Critical thinking

Individuals with good reasoning skills demonstrate strong critical thinking abilities. They can analyze information objectively, evaluate arguments, and identify logical inconsistencies. 

Their critical thinking skills enable them to approach problems and challenges with a logical and rational mindset, enabling them to make sound decisions and solve complex issues effectively.

Problem-solving aptitude

Excellent reasoning skills often go hand in hand with exceptional problem-solving aptitude. Candidates who excel in reasoning can break down complex problems into manageable components, identify patterns, and come up with innovative new strategies. 

They exhibit a natural curiosity, a willingness to explore different approaches, and the ability to think outside the box, enabling them to overcome obstacles and find creative resolutions.

Analytical thinking

A key trait of individuals with good reasoning skills is their ability to think analytically. They can dissect complex information, identify key components, and draw connections between various data points. 

With their analytical thinking skills, they can examine data objectively, discern trends or patterns, and make informed decisions based on evidence and logical deductions.

Logical reasoning

Strong reasoning skills are often indicative of individuals who possess logical reasoning abilities. They can follow sequences, identify cause-and-effect relationships, and draw conclusions based on deductive or inductive reasoning. 

Their logical reasoning skills enable them to evaluate options, anticipate potential outcomes, and choose the most appropriate course of action.

Flexible thinking

Employees with good reasoning skills often exhibit cognitive flexibility. They can adapt their thinking and approach to different situations, incorporating new information and adjusting their perspectives as needed. 

Their cognitive flexibility lets them consider multiple viewpoints, explore alternative options, and navigate complex challenges with an open mind. They re-evaluate assumptions and revise their thinking based on new insights or evidence.

Communication skills

For reasoning skills to be effective in the workplace, communication is key. It’s important that employees can articulate their thoughts clearly, present logical arguments, and express complex ideas in a concise manner. 

The ability to communicate effectively helps to convey the reasoning process, engage in meaningful discussions, and collaborate with others, fostering better teamwork and understanding within the organization. 

Workplace communication tests can evaluate candidates’ ability to communicate at work.

Curiosity and continuous learning

Individuals with good reasoning skills demonstrate a natural curiosity and a thirst for continuous learning.

They have a genuine interest in expanding their knowledge, exploring new ideas, and seeking out information to enhance their understanding. 

Their curiosity drives them to stay updated on industry trends, engage in self-improvement, and continuously develop their reasoning abilities.

When it comes to assessing a candidate’s reasoning skills, it’s important to delve deeper beyond surface-level observations. Understanding their critical thinking, problem-solving, and decision-making abilities is crucial. That’s where TestGorilla can lend a hand. 

Our extensive test library is a treasure trove of options to suit your needs. You can mix and match tests to create an assessment that aligns perfectly with your company’s requirements. 

Whether you’re searching for top-notch analysts or logical thinkers who thrive in challenging situations, our tests can help you discover exceptional candidates with the cognitive skills to excel.

Here are some of our most popular tests for assessing reasoning skills:

Critical Thinking test

At TestGorilla, we understand the significance of this test in evaluating a candidate’s ability to analyze information, make logical connections, and approach problems from multiple perspectives.

By incorporating the Critical Thinking test into your reasoning skills assessment, you gain valuable insights into an individual’s cognitive abilities and capacity to think critically in real-world scenarios. 

This test goes beyond simple memorization or rote learning; it assesses how candidates can apply their knowledge, reason through complex situations, and arrive at sound conclusions.

Verbal Reasoning test

Our Verbal Reasoning test is essential because it assesses language comprehension, critical thinking, and problem-solving abilities. It evaluates an individual’s capacity to understand written information and draw logical conclusions. 

This test also indirectly measures language proficiency and communication skills. Verbal reasoning tests are widely used because they predict academic and occupational success, and they provide a fair and accessible assessment tool for individuals from diverse backgrounds.

Spatial Reasoning test

TestGorilla’s Spatial Reasoning test assesses a candidate’s capacity to perceive and understand spatial relationships, shapes, and patterns. 

This skill is particularly relevant in fields such as engineering, architecture, design, and logistics, where professionals often encounter complex spatial problems. 

The Spatial Reasoning test also assesses a candidate’s capacity to mentally visualize and manipulate objects in space. These abilities are essential for tasks that involve spatial planning, such as interpreting maps, organizing physical spaces, or understanding 3D models. 

Candidates who perform well in spatial reasoning tests demonstrate a heightened ability to think ahead, anticipate outcomes, and develop effective strategies based on spatial information. 

Numerical Reasoning test

The Numerical Reasoning test provides valuable insights into a job candidate’s reasoning skills, particularly in terms of quantitative analysis, problem-solving, and logical thinking. 

By assessing a candidate’s proficiency in interpreting numerical data and making accurate deductions, this test assists you in identifying those who possess the numerical acumen necessary for roles involving financial analysis, data-driven decision-making, and problem-solving using quantitative methods.

Mechanical Reasoning test

While not all job roles require mechanical reasoning, this test is pertinent for positions that involve machinery, engineering, or technical operations by providing crucial insights into a candidate’s reasoning abilities in these areas.

The Mechanical Reasoning test evaluates a candidate’s understanding of mechanical principles and ability to apply that knowledge to solve problems. 

This test presents candidates with scenarios and questions that require them to analyze mechanical systems, interpret diagrams, and make logical deductions.

Problem Solving test

Problem-solving tests evaluate a candidate’s aptitude for analyzing issues from different perspectives, breaking them down into manageable components, and applying logical reasoning to reach effective resolutions. 

The Problem Solving test measures a candidate’s ability to think critically, make sound judgments, and adapt their problem-solving approach as necessary. 

Strong problem-solving skills are not limited to specific industries or job roles; they are highly transferable and valuable across various fields, including business, technology, healthcare, and customer service.

Attention to Detail (Textual) test

TestGorilla’s Attention to Detail (Textual) test offers valuable insights into a job candidate’s reasoning skills, particularly in assessing their ability to analyze and comprehend written information with precision and accuracy.

In most professional settings, the ability to pay close attention to detail is paramount. The Attention to Detail (Textual) test assesses a candidate’s proficiency in reading, comprehending, and scrutinizing written information, ensuring accuracy and completeness.

Big 5 (OCEAN) test

Reasoning skills are not solely dependent on cognitive abilities but are also influenced by an individual’s personality traits. 

The Big 5 (OCEAN) test assesses a candidate’s personality dimensions, providing a deeper understanding of their approach to challenges, level of openness to new ideas, organizational skills, propensity for collaboration, and emotional stability. 

For example, candidates with a high score in Conscientiousness demonstrate meticulous attention to detail and a structured approach to problem-solving, while those who get a high score in Openness exhibit creativity and a willingness to explore new ways of moving forward. 

By considering these traits alongside reasoning skills, you can gain a comprehensive understanding of a candidate’s potential to excel in tasks requiring critical thinking and reasoning.

Understanding Instructions test

The Understanding Instructions test plays a useful role in evaluating a job candidate’s reasoning skills, specifically their ability to understand and execute tasks based on given instructions accurately. 

This test focuses on assessing an individual’s attention to detail, critical thinking, and capacity to analyze and interpret instructions. 

It offers valuable insights into a candidate’s logical reasoning, problem-solving skills, and potential for success in roles that require close adherence to guidelines.

If you’re looking to identify candidates with exceptional reasoning skills, TestGorilla is here to support your hiring journey. With our extensive range of scientifically designed tests, we provide you with a powerful tool to assess and evaluate critical thinking and problem-solving abilities. 

By incorporating TestGorilla’s assessments into your hiring process, you’ll gain valuable insights into each candidate’s capacity to analyze, strategize, and make informed decisions, setting the stage for building a team of exceptional talent.

At TestGorilla, we understand that finding individuals who can think critically and adapt to complex challenges is crucial for your company’s success. Our tests are carefully crafted to gauge candidates’ logical reasoning, analytical skills, and cognitive abilities, giving you a comprehensive understanding of their reasoning prowess. 

By relying on TestGorilla’s innovative assessment platform, you can confidently identify top-tier candidates who will contribute fresh perspectives, creativity, and ingenuity to your organization.

Let us help you identify candidates with the critical thinking, problem-solving, and decision-making abilities your company needs to thrive.

Sign up for TestGorilla’s free plan today and experience the power of our reasoning skills assessments firsthand.


Cambridge University Faculty of Mathematics


Published 2014 Revised 2021

Using NRICH Tasks to Develop Key Problem-solving Skills

In her article Developing Excellence in Problem Solving with Young Learners, Jennie Pennant suggests that as teachers we can help children get better at problem solving in three main ways, one of which is through 'explicitly and repeatedly providing children with opportunities to develop key problem-solving skills'. This article builds on Jennie's. In particular, it explains what we mean by 'problem-solving skills' and aims to give further guidance on how we can help learners to develop these skills by highlighting relevant NRICH tasks.

What do we mean by 'problem-solving skills'?

In the aforementioned article, Jennie outlines four stages of the problem-solving process:

  • Stage 1: Getting started on the task
  • Stage 2: Working on the problem
  • Stage 3: Taking the mathematics further
  • Stage 4: Explaining findings and reflecting on the methods used

By explicitly drawing children's attention to these four stages, and by spending time on them in turn, we can help children become more confident problem solvers.  Jennie outlines different ways in which learners might get started on a task (stage 1), but it is once they have got going and are working on the problem (stage 2) that children will be making use of their problem-solving skills. Here are some useful problem-solving skills:

  • Trial and improvement
  • Working systematically (and remember there will be more than one way of doing this: not just the one that is obvious to you!)
  • Pattern spotting
  • Working backwards
  • Reasoning logically
  • Visualising
  • Conjecturing

The first two in this list are perhaps particularly helpful. As learners progress towards a solution, they may take the mathematics further (stage 3), and a further problem-solving skill becomes important:

  • Generalising

Having reached a solution, stage 4 of the process then involves children explaining their findings and reflecting on different methods used. For the purposes of this article, we will think of 'problem-solving skills' as  those skills that learners need in order to work on the mathematics of a task, during stages 2 and 3 of the problem-solving process.  
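
To make the first two skills in the list concrete, here is a small illustrative sketch; the puzzle and code are our own, not from the article. "Working systematically" checks every candidate in order, while "trial and improvement" refines a guess using feedback from each attempt.

```python
# Illustrative puzzle (invented): find the whole number whose square is 1369.

def working_systematically(target):
    """Check each candidate in order until the answer is found."""
    n = 0
    while n * n < target:
        n += 1
    return n if n * n == target else None

def trial_and_improvement(target, low=0, high=100):
    """Refine a guess: keep narrowing between a guess that is too low
    and one that is too high until they meet."""
    while low < high:
        guess = (low + high) // 2
        if guess * guess < target:
            low = guess + 1
        else:
            high = guess
    return low if low * low == target else None

print(working_systematically(1369))  # 37
print(trial_and_improvement(1369))   # 37
```

The systematic search guarantees nothing is missed; trial and improvement reaches the answer in far fewer steps, which mirrors how learners often move from one skill to the other.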

How can we help children get better at these problem-solving skills?


How can procedural flowcharts support the development of mathematics problem-solving skills?

  • Original Article
  • Open access
  • Published: 22 February 2024


  • Musarurwa David Chinofunga   ORCID: orcid.org/0000-0002-0262-3039 1 ,
  • Philemon Chigeza   ORCID: orcid.org/0000-0001-9964-0988 1 &
  • Subhashni Taylor   ORCID: orcid.org/0000-0002-1624-0901 1  


Supporting students’ problem-solving skills, solution planning and sequencing of different stages that are involved in successfully developing a meaningful solution to a problem has been a challenge for teachers. This case study was informed by reflective investigation methodology which explored how procedural flowcharts can support student mathematics problem solving in a senior Mathematical Methods subject in Queensland. The paper used thematic analysis to analyse and report on teachers’ perceptions of the utility of procedural flowcharts during problem solving as well as content analysis on how student-developed flowcharts can support their problem-solving skills. Results show that development of procedural flowcharts can support problem solving as it helps with integration of problem-solving stages.


Introduction

Problem solving is central to teaching and learning of mathematics (see Cai, 2010 ; Lester, 2013 ; Schoenfeld et al., 2014 ). For decades, research in mathematics problem solving, including special issues from leading mathematics education journals (see, Educational Studies in Mathematics, (Vol. 83, no. 2013); The Mathematics Enthusiast, (Vol. 10, nos. 1–2); ZDM , (Vol. 39, nos. 5–6)), have offered significant insights but struggled to produce well-articulated guidelines for educational practice (English & Gainsburg, 2016 ). This could possibly be the reason why mathematics teachers’ efforts to improve students’ problem-solving skills have not produced the desired results (Anderson, 2014 ; English & Gainsburg, 2016 ). Despite Polya’s ( 1945 ) heuristics being so valuable in problem solving, there appears to be limited success when translated into the classroom environment (English & Gainsburg, 2016 ). English and Gainsburg went further to posit that one of the issues to be addressed is how to support problem-solving competency in students during the process of problem solving. Thus, teachers’ perceptions in this study are a valuable part in evaluating how procedural flowcharts can support problem solving.

The problem-solving process is a dialogue between the prior knowledge the problem solver possesses, the tentative plan of solving the problem and other relevant thoughts and facts (Schoenfeld, 1983 ). However, research is still needed on tools that teachers can use to support students during problem solving (Lester & Cai, 2016 ). Although research in mathematics problem solving has been progressing, it has remained largely theoretical (Lester, 2013 ). Schoenfeld ( 2013 ) suggests that research focus should now advance from the framework for examining problem solving to explore how ideas grow and are presented and shared during the problem-solving process. Recently, Kaitera and Harmoinen ( 2022 ) emphasised the need to support teachers through resources that can help students develop problem solving skills. They went on to posit that resources that can assist students in presenting different approaches to a solution and displaying their understanding are critical to build their problem-solving skills.

The study by Kaitera and Harmoinen ( 2022 ) introduced mathematics students to ‘problem-solving keys’ which are heuristics for problem solving that students are to follow as they engage with tasks. Their conclusion was also noted by Vale and Barbosa ( 2018 ) who observed that a key area that would benefit from further research is the identification of strategies or plan that support students’ ability to construct and present their mathematical knowledge effectively during problem solving, particularly if complex processes such as integration and modification of several procedures are involved. Similarly, students face challenges in connecting or bringing all the ideas together and showing how they relate as they work towards the solution (Reinholz, 2020 ). Problem solving in mathematics is challenging for students (Ahmad et al., 2010 ), and therefore, supporting students’ problem-solving skills needs urgent attention (Schoenfeld, 2016 ). Furthermore, Mason ( 2016 ) posits that the crucial yet not significantly understood issue for adopting a problem-solving approach to teaching is the issue of “when to introduce explanatory tasks, when to intervene and in what way” (p. 263). Therefore, teachers also need resources to support the teaching of problem-solving skills, often because they were not taught these approaches when they were school students (Kaitera & Harmoinen, 2022 ; Sakshaug & Wohlhuter, 2010 ).

Flowcharts have been widely used in problem solving across different fields. In a technology-rich learning environment such as Lego Robotics, creating flowcharts to explain processes was observed to facilitate understanding, thinking, making sense of how procedures relate, investigating and communicating the solution (Norton et al., 2007 ). They are effective in guiding students during problem solving (Gencer, 2023 ), enhancing achievement and improving problem-solving skills in game-based intelligent tutoring (Hooshyar et al., 2016 ). Flowcharts have been identified as an effective problem-solving tool in health administration (McGowan & Boscia, 2016 ). In mathematics education, heuristic trees and flowcharts were observed to supplement each other in influencing students’ problem solving behaviour (Bos & van den Bogaart, 2022 ). Importantly, McGowan and Boscia emphasised that “one of the greatest advantages of a flowchart is its ability to provide for the visualisation of complex processes, aiding in the understanding of the flow of work, identifying nonvalue-adding activities and areas of concern, and leading to improved problem-solving and decision-making” (p. 213). Identifying the most appropriate strategy and making the correct decision at the right stage are keys to problem solving. Teaching students to use visual representations like flowcharts as part of problem solving supports the ability to easily identify new relationships among different procedures and assess the solution being communicated faster as visual representations are more understandable (Vale et al., 2018 ).
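
As a minimal, hypothetical sketch (ours, not from the studies cited), a procedural flowchart can be represented as a data structure and traversed programmatically: nodes are steps, and a decision node loops back until its check passes, loosely following an understand-plan-execute-check cycle.

```python
# A procedural flowchart as a dict: plain nodes point to the next step,
# decision nodes branch on a condition. All names here are illustrative.

flowchart = {
    "understand": {"next": "plan"},
    "plan":       {"next": "execute"},
    "execute":    {"next": "check"},
    "check":      {"branch": lambda ok: "done" if ok else "plan"},
    "done":       {},
}

def walk(chart, start, solution_found_after=2):
    """Traverse the chart, looping back to 'plan' until the check passes
    (here, simulated as succeeding on the second attempt)."""
    path, node, attempts = [], start, 0
    while node:
        path.append(node)
        step = chart[node]
        if "branch" in step:
            attempts += 1
            node = step["branch"](attempts >= solution_found_after)
        else:
            node = step.get("next")
    return path

print(walk(flowchart, "understand"))
# ['understand', 'plan', 'execute', 'check', 'plan', 'execute', 'check', 'done']
```

The traversal makes the loop between planning and checking explicit, which is exactly the kind of nonlinear flow a drawn flowchart helps students visualise.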

The purpose of this case study was to explore, through an in-depth teacher’s interview, and student-developed artefacts, the utility of procedural flowcharts in supporting the development of students’ problem-solving skills in mathematics. The study will focus on problem solving in Mathematical Methods which is one of the calculus-based mathematics subjects at senior school in Queensland. The aim was to investigate if the development of procedural flowcharts supported students in planning, logically connecting and integrating mathematical procedures (knowledge) and to communicate the solution effectively during problem solving. The use of flowcharts in this study was underpinned by the understanding that visual aids that support cognitive processes and interlinking of ideas and procedures influence decision-making, which is vital in problem-based learning (McGowan & Boscia, 2016 ). Moreover, flowcharts are effective tools for communicating the processes that need to be followed in problem solving (Krohn, 1983 ).

Problem-solving learning in mathematics education

The drive to embrace a problem-solving approach to develop and deepen students’ mathematics knowledge has been a priority in mathematics education (Koellner et al., 2011 ; Sztajn et al., 2017 ). In the problem-solving approach, the teacher provides the problem to be investigated by students who then design ways to solve it (Colburn, 2000 ). To engage in problem solving, students are expected to use concepts and procedures that they have learnt (prior knowledge) and apply them in unfamiliar situations (Matty, 2016 ). Teachers are encouraged to promote problem-solving activities as they involve students engaging with a mathematics task where the procedure or method to the solution is not known in advance (National Council of Teachers of Mathematics [NCTM], 2000 ), thus providing opportunities for deep understanding as well as providing students with the opportunity to develop a unique solution (Queensland Curriculum and Assessment Authority [QCAA], 2018 ). Using this approach, students are given a more active role through applying and adapting procedures to solve a non-routine problem and then communicating the method (Karp & Wasserman, 2015 ). The central role problem solving plays in developing students’ mathematical understanding has resulted in the development of different problem-solving models over the years.

The process of problem solving in mathematics requires knowledge to be organised as the solution is developed and then communicated. Polya was among the first to systematise problem solving in mathematics (Voskoglou, 2021). Students need to understand the problem, plan the solution, execute the plan and reflect on the solution and process (Polya, 1971). Voskoglou’s (2021) problem-solving model emphasised that the process of modelling involves analysis of the problem, mathematisation, solution development, validation and implementation. Similarly, problem solving is guided by four phases: discover, devise, develop and defend (Makar, 2012). During problem solving, students engage with an unfamiliar real-world problem, develop plans in response, justify mathematically through representation, then evaluate and communicate the solution (Artigue & Blomhøj, 2013). Furthermore, Schoenfeld (1980) posited that problem solving involves problem analysis, exploration, design, implementation and verification of the solution. When using a problem-solving approach, students can pose questions; develop ways to answer the problem (which might include drawing diagrams, carrying out calculations, defining relationships and making conclusions); and interpret, evaluate and communicate the solution (Artigue et al., 2020; Dorier & Maass, 2020). Problem solving involves understanding the problem, devising and executing the plan, and evaluating the result (Nieuwoudt, 2015). Likewise, Blum and Leiß (2007) developed a modelling approach informed by these stages: understanding, simplifying, mathematising, working mathematically, interpreting and validating.

Similarly, mathematical modelling involves identifying the problem from a contextualised real-world situation, linking the solution to mathematics concepts, carrying out mathematical manipulations, justifying and evaluating the solution in relation to the problem, and communicating findings (Geiger et al., 2021). In modelling, Galbraith and Stillman (2006) suggested that further research is needed into fostering students’ ability to transition effectively from one phase to the next. “Mathematical modelling is a special kind of problem solving which formulates and solves mathematically real-world problems connected to science and everyday life situations” (Voskoglou, 2021, p. 85). As part of problem solving, mathematical modelling requires students to interpret information from a variety of narrative, expository and graphic texts that reflect authentic real-life situations (Doyle, 2005).

There are different approaches to problem solving and modelling, but all of them focus on solving real-world problems using mathematical procedures and strategies (Hankeln, 2020). A literature synthesis is critical where several models exist, as it can be used to develop an overarching conceptual model (Snyder, 2019). Torraco (2005) noted that literature synthesis can be used to integrate different models that address the same phenomenon; in this study, it was used to integrate the problem-solving models cited in the literature. Moreover, the review was necessitated by the need to reconceptualise Polya’s (1971) problem-solving model, given that the definition of problem solving has now broadened to include modelling. Torraco further suggested that as the literature on a topic grows and knowledge expands to accommodate new insights, a fresh synthesis is needed to reflect the changes. Thus, the model in Fig. 1 took into consideration the key stages broadly identified by these researchers and the understanding that modelling is part of problem solving. Problem solving and modelling is generally a linear process that can include loops, depending on how effectively the problem identification, mathematisation and implementation address the problem (Blum & Leiß, 2007; Polya, 1957).

Figure 1. Stages of mathematics problem solving

Figure  1 identifies the main stages that inform mathematics problem solving from the literature.

Problem identification and the design to solve the problem might be revisited if the procedures that were identified and their mathematical justification do not address the problem. Likewise, justification and evaluation after implementation might prompt the problem solver to realise that the problem was incorrectly identified. The loop is identified by the backward arrow, and the main problem-solving stages are identified by the linear arrows. The Australian Curriculum, Assessment and Reporting Authority notes that during problem solving:

Students solve problems when they use mathematics to represent unfamiliar situations, when they design investigations and plan their approaches, when they apply their existing knowledge to seek solutions, and when they verify that their answers are reasonable. Students develop the ability to make choices, interpret, formulate, model and investigate problem situations, and communicate solutions effectively. (Australian Curriculum, Assessment and Reporting Authority, 2014, p. 5)

Therefore, during problem solving, students have to plan the solution to the problem and be able to communicate all the key processes involved. However, although problem solving is highly recommended in mathematics education, it presents several challenges for teachers in terms of how they can best support students to connect the processes and mathematics concepts into something coherent that can lead to a meaningful solution (Hacker, 1998 ). Therefore, relevant tools that support problem solving and decision-making can make a difference for both mathematics teachers and students (McGowan & Boscia, 2016 ).

Students can solve problems better if they can think critically (Kules, 2016). Problem solving requires their active engagement in analysing, conceptualising, applying concepts, evaluating, comparing, sequencing, synthesising, reasoning, reflecting and communicating, which are skills that are said to promote critical thinking (Kim et al., 2012; King, 1995; Moon, 2008; QCAA, 2018). Similarly, the ability to undertake problem solving is supported when students are provided with the opportunity to sequence ideas logically and evaluate the optimal strategy to solve the problem (Parvaneh & Duncan, 2021). However, finding tools that can support problem solving has long been a focus for researchers, with very limited breakthroughs (McCormick et al., 2015). This study explored how procedural flowcharts, as visual representations, can support students in organising ideas, executing procedures, justifying solutions and communicating their solutions.

Importance of visual representations in mathematics problem-solving

Research on how visual representations support mathematics discovery and structural thinking in problem solving has come a long way (see Hadamard, 1945 ; Krutetskii, 1976 ; Polya, 1957 ). Visual representations are classified as graphs, tables, maps, diagrams, networks and icons and are widely used to convey information in a recognisable form that can be easily interpreted without resorting to tedious computations (Lohse et al., 1994 ). Visual representations can be used as a tool to capture mathematics relations and processes (van Garderen et al., 2021 ) and used in many cognitive tasks such as problem solving, reasoning and decision making (Zhang, 1997 ). Indeed, representations can be modes of communicating during concepts exploration and problem solving (Roth & McGinn, 1998 ). Likewise, visual representations can be a powerful way of presenting the solution to a problem, including self-monitoring on how the problem is being solved (Kingsdorf & Krawec, 2014 ; Krawec, 2014 ). Using visualisations created by teachers or students in mathematics can support students’ problem-solving abilities (Csíkos et al., 2012 ).

Visual representations show thoughts in a non-linguistic format, which is effective for communication and reflection. “Visual representations serve as tools for thinking about and solving problems. They also help students communicate their thinking to others” (NCTM, 2000, p. 206). In mathematics, visual representation plays a significant role in showing the cognitive constructs of the solution (Owens & Clements, 1998), a view echoed by Arcavi (2003), who said that visual representations can be appreciated as a central part of reasoning and as a resource to use in problem solving. More importantly, they can be used to represent the logical progression of ideas and reasoning during problem solving (Roam, 2009). However, there is a need to explore how visual representations can be used to support and illustrate the problem-solving process and to create connections among concepts (Stylianou, 2010). Importantly, developing diagrams is often a recommended strategy for solving mathematics problems (Pape & Tchoshanov, 2001; Jitendra et al., 2013; Zahner & Corter, 2010). Therefore, this study explores the utility of procedural flowcharts as a visual representation and resource in supporting problem analysis, problem understanding, solution development and evaluation, while communicating the whole problem-solving process effectively. It goes further to explore how the development of procedural flowcharts can support educational practice in the Mathematical Methods subject.

Procedural flowcharts are a visual representation of procedures, corresponding steps and stages of evaluation of a solution to a problem (Chinofunga et al., 2022 ). These authors noted that procedural flowcharts developed by the teacher can guide students during the inquiry process and highlight key procedures and stages for decision-making during the process of problem solving. This is because “a procedural flowchart graphically displays the information–decision–action sequences in the proposed order” (Krohn, 1983 , p. 573). Similarly, Chinofunga and colleagues ( 2022 ) emphasised that procedural flowcharts can be used to visually represent procedural flexibility as more than one procedure can be accommodated, making it easier to compare the effectiveness of different procedures as they are being applied. They further posited that student-developed procedural flowcharts provide students with the opportunity to comprehensively engage with the problem and brainstorm different ways of solving it, thus deepening their mathematics knowledge. Moreover, a procedural flowchart can be a visual presentation of an individual or group solution during problem solving.

Research has identified extended benefits of problem solving in small groups (Laughlin et al., 2006). Giving groups an opportunity to present a solution visually can be a quicker way to evaluate a group solution, because visual representations can convey large amounts of information (even from different sources) in a simple way (Raiyn, 2016). Equally, Vale and colleagues (2018) encouraged the visual representation of tasks with multiple solutions as a tool for teaching students problem solving. Therefore, students can be asked to develop procedural flowcharts individually and then come together to synthesise the different procedural flowcharts.

Similarly, flowcharts are a visual aid used to represent how procedures interrelate and function together. “They are tools to visually break down complex information into individual building blocks and how the blocks are connected” (Grosskinsky et al., 2019, p. 24). They lay out the steps in a procedure and show how they can be applied, thus helping to visualise the process (Ledin & Machin, 2020; Reingewertz, 2013). Flowcharts can also be used when a logical and sequenced approach is needed to address a problem (Cantatore & Stevens, 2016). Importantly, in schools, Norton and colleagues (2007) noted that “planning facilitated through the use of flow charts should be actively encouraged and scaffolded so that students can appreciate the potential of flow charts to facilitate problem-solving capabilities” (p. 15). This is because the use of flowcharts in problem solving provides a mental representation of a proposed approach to solve a task (Jonassen, 2012). The success of flowcharts in problem solving in different fields can be attributed to their ability to facilitate deep engagement in planning the solution to the problem.

The use of flowcharts has distinct advantages that can benefit problem solving. Norton and colleagues (2007) posited that using a well-planned and well-constructed flowchart in problem solving results in a good-quality solution. Furthermore, flowcharts can be a two-way communication resource between a teacher and students or among students (Grosskinsky et al., 2019). These authors further noted that flowcharts can help teachers check and track students’ progress and guide them, and that flowcharts can also be used to highlight important procedures for students to follow during problem solving.

Similarly, flowcharts can be used to provide a bigger picture of the solution to a problem (Davidowitz & Rollnick, 2001). Flowcharts help students gain an overall and coherent understanding of the procedures involved in solving the problem, as they promote conceptual chunking (Norton et al., 2007). Importantly, “they may function to amplify the zone of proximal development for students by simplifying tasks in the zone” (Davidowitz & Rollnick, 2001, p. 22). Use of flowcharts by students reduces cognitive load, which may then help them focus on more complex tasks (Berger, 1998; Sweller et al., 2019). Indeed, the development of problem-solving skills can be supported when teachers introduce learning tools such as flowcharts, because they can help structure how the solution is organised (Santoso & Syarifuddin, 2020). Therefore, the use of procedural flowcharts in mathematics problem solving has the potential to transform the process.

The research question in this study was informed by the understanding that limited resources are available to teachers to support students’ problem-solving abilities. In addition, the literature indicates that visual representations such as procedural flowcharts can support students’ potential in problem solving. Therefore, the research described in this study addressed the following research question: What are teachers’ perceptions of how procedural flowcharts can support the development of students’ problem-solving skills in the Mathematical Methods subject?

Methodology

The case study draws on the reflective investigation methodology (Trouche et al., 2018, 2020). This methodology explores how teaching and learning were supported by facilitating a teacher’s reflection on the unexpected use of a resource, in this case procedural flowcharts. The reflective methodology emphasises a teacher’s active participation through soliciting views on current practice and recollection of previous work (Trouche et al., 2020). Using this methodology, a teacher is asked to reflect on and describe the resource used, the structure (related to the activity), the implementation and the outcomes (Huang et al., 2023).

This case study focuses on phases three and four of a broader PhD study that involved four phases and was informed by constructivism. Phase one investigated Queensland senior students’ mathematics enrolment in different mathematics curricula options from 2010 to 2020. Phase two developed and introduced pedagogical resources that could support the planning, teaching and learning of calculus-based mathematics, with a special focus on functions in Mathematical Methods. The pedagogical resources included a framework on mathematics content sequencing, developed through literature synthesis, to guide teachers on how to sequence mathematics content during planning. This phase also introduced concept maps as a resource for linking prior knowledge to new knowledge in a constructivist setting, and procedural flowcharts as a resource to support the development of procedural fluency in mathematics. Importantly, a conference workshop organised by the Queensland Association of Mathematics Teachers (Cairns Region) provided an opportunity for teachers to contribute their observations on ways that concept maps and procedural flowcharts can be used to support teaching. Phase three was a mixed-method study that evaluated the pedagogical resources developed or introduced in phase two with 16 purposively sampled senior mathematics teachers in Queensland who had been given a full school term to use the resources in their practice. Some qualitative data collected through semistructured interviews in phase three are included in the results of the study reported here. During the analysis of the qualitative data, a new theme emerged which pointed to an unexpected use of procedural flowcharts during teaching and learning beyond developing procedural fluency. As a result, the researchers decided to explore, as an additional phase, how the development of procedural flowcharts supported the teaching and learning of mathematics. Phase four involved an in-depth interview with Ms. Simon (pseudonym), a teacher who had unexpectedly applied procedural flowcharts in a problem-solving task, which warranted further investigation. Ms. Simon’s use of procedural flowcharts was unexpected, as she had used them outside the context and original focus of the broader study. Importantly, in phase four, artefacts created by the teacher and her four students in the problem-solving task were also collected.

Ms. Simon had explored the use of procedural flowcharts in a problem-solving and modelling task (PSMT) in her year 11 Mathematical Methods class. This included an introduction to procedural flowcharts, followed by a task in which students were asked to develop a procedural flowchart as an overview of how they would approach a problem-solving task. The students were expected first to develop the procedural flowcharts independently and then to work collaboratively to develop and structure an alternative solution to the same task. The student-developed procedural flowcharts (artefacts) and the in-depth interview with Ms. Simon were included in the analysis. As this was an additional study, an ethics amendment was applied for and granted by the James Cook University Ethics Committee (approval number H8201), as the collection of students’ artefacts was not covered by the main study’s ethics approval for teachers.

Research context of phase four of the study

In the state of Queensland, senior mathematics students engage with three formal internal assessments (set by schools but endorsed by the QCAA) in year 12 before the end-of-year external examination. The formal internal assessments consist of two written examinations and a problem-solving and modelling task (PSMT). The PSMT is expected to cover content from Unit 3 (Further Calculus). The summative external examination contributes 50% and the PSMT 20% of the overall final mark, meaning the PSMT carries the highest weight among the three formal internal assessments.

The PSMT is the first assessment in the first term of year 12 and is set to be completed in 4 weeks. Within those 4 weeks, students are given 3 h of class time to work on the task and write a report of up to 10 pages or 2000 words. The 4 weeks are divided into four checkpoints, one per week, with the fourth being the submission date. At each of the other three checkpoints, students are expected to email their progress to the teacher. At checkpoint one, the student formulates a general plan for solving the problem that is detailed enough for the teacher to provide meaningful feedback. Checkpoint one is where this study expects teachers to provide students with the opportunity to develop a procedural flowchart of the plan to reach the solution. Importantly, at checkpoint one, teachers are interested in understanding which mathematics concepts students will select and apply to try to solve the problem, and how these concepts integrate or complement each other to develop a mathematically coherent, valid and appropriate solution. Moreover, teachers are expected to have provided students with opportunities to develop skills in undertaking problem-solving and modelling tasks before they engage with this formal internal assessment. The QCAA has provided a flowchart to guide teachers and students on how to approach a PSMT (Appendix 1).

Participants in phase four of the study

Ms. Simon and a group of four students were the participants in this study. Ms. Simon had studied mathematics as part of her undergraduate education degree, making her a highly qualified mathematics teacher. At the time of this study, she was the Head of Science and Mathematics and a senior mathematics teacher at one of the state high schools in Queensland. She had 35 years’ experience in teaching mathematics across Australia in both private and state schools, 15 of which were as a curriculum leader. She was also part of the science, technology, engineering and mathematics (STEM) state-wide professional working group. Since the inception of the external examination in Queensland in 2020, she had been an external examination marker and an assessment endorser for Mathematical Methods with the QCAA. The students who were part of this study were aged between 17 and 18 years and were from Ms. Simon’s Mathematical Methods senior class. Two artefacts were from individual students, and the third was a collaborative work from two of the students.

Phase four data collection

First, data were collected through an in-depth interview between the researcher and Ms. Simon. The researcher used pre-prepared questions and incidental questions arising from the interview. The questions focused on exploring how she had used procedural flowcharts in a PSMT with her students. The interview also focused on her experiences, observations, opinions, perceptions and results, comparing the new experience with how she had previously engaged her students in such tasks. The interview lasted 40 min, was transcribed and coded so as to provide evidence of the processes involved in the problem solving. Some of the pre-prepared questions were as follows:

What made you consider procedural flowcharts as a resource that can be used in a PSMT?

How have you used procedural flowcharts in PSMT?

How has the use of procedural flowcharts transformed students’ problem-solving skills?

How have you integrated procedural flowcharts to complement the QCAA flowchart on PSMT in mathematics?

What was your experience of using procedural flowcharts in a collaborative setting?

How can procedural flowcharts aid scaffolding of problem-solving tasks?

Second, Ms. Simon shared her formative practice PSMT task (described in detail below), and three of her students’ artefacts. The artefacts that she shared (with the students’ permission) were a critical source of data as they were a demonstration of how procedural flowcharts produced by students can support the development of problem solving and provided an insight into the use of procedural flowcharts in a PSMT.

Problem-solving and assessment task

The formative practice PSMT that Ms. Simon shared is summarised below under the subheadings: Scenario, Task, Checkpoints and Scaffolding.

You are part of a team that is working on opening a new upmarket Coffee Café. Your team has decided to cater for mainly three different types of customers. Those who:

Consume their coffee fast.

Have a fairly good amount of time to finish their coffee.

Want to drink their coffee very slowly as they may be reading a book or chatting.

The team has tasked you to come up with a model or models that can be used to understand the cooling of coffee in relation to the material the cup is made from and the temperature of the surroundings.

Write a mathematical report of at most 2000 words or up to 10 pages that explains how you developed the cooling model/s and took into consideration the open cup, the material the cup was made from, the cooling time, the initial temperature of the coffee and the temperature of the surroundings.

Design an experiment that investigates the differences in the time of cooling of a liquid in open cups made from different materials. Record your data in a table.

Develop a procedural flowchart that shows the steps that you used to arrive at a solution for the problem.

Justify your procedures and decisions by explaining mathematical reasoning.

Provide a mathematical analysis of formulating and evaluating models using both mathematical manipulation and technology.

Provide a mathematical analysis that involves differentiation (rate of change) and/or anti-differentiation (area under a curve) to satisfy the needs of each category of customers.

Evaluate the reasonableness of solutions.

You must consider Newton’s Law of Cooling, which states that the rate of change of the temperature of an object is proportional to the difference between its own temperature and the temperature of its surroundings. For a body that has a higher temperature than its surroundings, Newton’s Law of Cooling can model the rate at which the object is cooling in its surroundings through an exponential equation. This equation can be used to model any object cooling in its surroundings:

y = A₀e^(−kt)

where:

y is the difference between the temperature of the body and its surroundings after t minutes,

A₀ is the difference between the initial temperature of the body and its surroundings,

k is the cooling constant.
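To make the model concrete, the cooling relationship above can be sketched numerically. The following Python sketch is illustrative only and is not part of the study’s task materials; the function names and the sample temperatures are assumptions chosen for the example.

```python
import math

def cooling_constant(a0, y_t, t):
    """Solve y = A0 * exp(-k*t) for the cooling constant k,
    given the temperature difference y_t observed after t minutes."""
    return -math.log(y_t / a0) / t

def temp_difference(a0, k, t):
    """Difference between the body's temperature and its surroundings after t minutes."""
    return a0 * math.exp(-k * t)

def cooling_rate(a0, k, t):
    """Rate of change dy/dt = -k * A0 * exp(-k*t), per Newton's Law of Cooling."""
    return -k * a0 * math.exp(-k * t)

# Illustrative data: coffee at 90 °C in a 20 °C room (A0 = 70 °C)
# has cooled to 55 °C (difference 35 °C) after 10 minutes.
k = cooling_constant(70, 35, 10)                  # ln(2)/10 ≈ 0.0693 per minute
temp_after_20 = 20 + temp_difference(70, k, 20)   # 20 + 70 * 0.25 = 37.5 °C
print(round(k, 4), round(temp_after_20, 1))
```

Students could fit k from their experimental data in exactly this way, once per cup material, and then compare the resulting curves; the derivative function corresponds to the rate-of-change analysis the task asks for.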

Checkpoints

Week 1: Students provide individual data from the experiment and create a procedural flowchart showing the proposed solution to the problem. Teacher provides individual feedback.

Week 2: Students provide a consolidated group procedural flowchart. Teacher provides group feedback.

Week 3: Students email a copy of their individually developed draft report for feedback.

Week 4: Students submit their individual final response in digital (PDF) format by emailing a copy to their teacher, providing a printed copy to their teacher and saving a copy in their Maths folder.

Additional requirements/instructions

The response must be presented using an appropriate mathematical genre (i.e., a mathematical report).

The approach to problem-solving and mathematical modelling must be used.

All sources must be referenced.

Data analysis

The analysis of data includes some observations and perceptions of mathematics teachers collected through surveys and interviews in phase three of the broader PhD study. The survey and interview data from the broader study, including the phase four in-depth interview with Ms. Simon, were transcribed and coded using thematic analysis (TA). TA is widely used in qualitative research to identify and describe patterns of meaning within data (Braun & Clarke, 2006; Ozuem et al., 2022). Thematic validity was ensured using theory triangulation, which involves sharing qualitative responses among colleagues at different status positions in the field and then comparing findings and conclusions (Guion et al., 2011). The study adopted an inductive approach, which produces codes that are solely reflective of the contents of the data (Byrne, 2022).

Coding was done with no pre-set codes, and line-by-line coding was used, as this was mainly an inductive analysis. The research team, comprising the researcher and two advisors/supervisors, met to set the initial coding mechanism and to code part of the data for consistency before independently coding all of the data. This approach is supported by King (2004), who suggested that when searching for themes it is best to start with a few codes to help guide analysis. The data covered a wide variety of concepts, so the different concepts that grouped the research questions were initially used as ‘conceptual themes’ to organise the data. The research team examined the codes, checking their meaning and the relationships between them to determine which ones were underpinned by a central concept. In Excel, codes from the open-ended responses and interview transcripts that shared a core idea were colour coded. After the independent thematic analysis, the filter function in Excel was used to sort the codes by cell colour. Excel also provided the opportunity to identify duplicates as codes were collated from the three researchers. Codes of the same colour were synthesised to develop a general pattern of meaning, which we referred to as candidate themes. This sorting and collation approach brought together all codes under each theme, which then facilitated further analysis and review (Bree et al., 2014).
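The collate-by-colour workflow described above (group codes under a candidate theme, drop duplicates) can be mimicked in a few lines of code. This is a hypothetical illustration of the mechanics only; the code labels, colours and excerpts below are invented and are not the study’s actual data or tooling.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (code label, colour assigned during coding, excerpt).
coded_data = [
    ("visual overview", "green", "The flowchart shows the whole plan at once."),
    ("timely feedback", "blue", "I could comment on drafts much faster."),
    ("visual overview", "green", "The flowchart shows the whole plan at once."),  # duplicate
    ("procedure links", "green", "Students connected the steps logically."),
]

def collate_by_colour(rows):
    """Group codes under candidate themes (colours) and drop duplicates,
    mirroring the Excel filter-by-cell-colour and de-duplication steps."""
    themes = defaultdict(list)
    for code, colour, excerpt in rows:
        if (code, excerpt) not in themes[colour]:
            themes[colour].append((code, excerpt))
    return dict(themes)

themes = collate_by_colour(coded_data)
print(len(themes["green"]))  # 2 distinct green codes survive de-duplication
```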

The research team went on to review the relationship between the data and the codes that informed the themes. This is supported by Braun and Clarke (2012, 2021), who posited that researchers should conduct a recursive review of the candidate themes in relation to the coded data items and the entire dataset. During the review, whenever themes were integrated or codes were moved to another theme, a new spreadsheet was created so that if further review was necessary, the old data and layout would still be available. Importantly, if the codes form a coherent and meaningful pattern, the theme makes a logical argument and may be representative of the data (Nowell et al., 2017). Furthermore, the team reviewed the themes in relation to the data, because Nowell and others posited that themes should provide the most accurate interpretation of the data. The research team also discussed and wrote a detailed analysis for each candidate theme, identifying the main story behind each theme and how each one fitted into the overall story about the data through the lens of the research questions. Finally, the researchers linked quotes to the final themes reached during the analysis. Illustrating findings with direct quotations from the participants strengthens the face validity and credibility of the research (Byrne, 2022; Nowell et al., 2017; Patton, 2002).

Student artefacts

The students’ artefacts (procedural flowcharts) in Figs. 5, 6 and 7 were analysed using content analysis. Content analysis can be used to analyse written, verbal or visual representations (Cole, 1988; Elo & Kyngäs, 2008) and is ideal when there is a greater need to identify critical processes (Lederman, 1991). Unlike interviews, documents that are ideal for qualitative analysis should be developed independently, without the researcher’s involvement (Merriam & Tisdell, 2015). In fact, the documents should not have been prepared for the purpose of research (Hughes & Goodwin, 2014); hence they are a stable and discrete data source (De Massis & Kotlar, 2014; Merriam & Tisdell, 2015). The students’ artefacts used in this study were not prepared for the purpose of the study but as a mathematics task. Deductive content analysis is used when the structure of analysis is implemented on the basis of previous knowledge and the purpose of the study is model testing or confirmation (Burns & Grove, 2009). Similarly, it is an analytical method that aims to test existing concepts, models or hypotheses in a new context (Kyngäs et al., 2020). Kyngäs and colleagues further noted that researchers can use deductive analysis to determine how a model fits a new context.

Deductive content analysis follows three main stages: preparation, organising and reporting (Elo et al., 2014; Elo & Kyngäs, 2008). Firstly, preparation involves identifying the unit of analysis (Guthrie et al., 2004); in this study, the units of analysis were the artefacts developed by the students. This phase also requires the researcher to be immersed in the data, reading and digesting it to make sense of the whole dataset through reflexivity, open-mindedness and following the rationale that guided participants’ narratives or the development of the artefact (Azungah, 2018). Secondly, a categorisation matrix based on existing knowledge should be developed or identified to facilitate the coding of the data according to categories (Hsieh & Shannon, 2005) (Table 1). Importantly, when using deductive content analysis, researchers require a theoretical structure or model from which they can build an analysis matrix (Kyngäs et al., 2020). Finally, the analysis results should be reported in ways that promote interpretation of the data and the results, for example, in tabular form (Elo & Kyngäs, 2008) (Fig. 2).
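A categorisation matrix of this kind can be thought of as a mapping from problem-solving stages to indicators, against which each flowchart step is coded. The following Python sketch is purely illustrative: the stage names are taken loosely from the stages in Fig. 1, and the indicator keywords are invented for the example rather than drawn from the study’s actual matrix (Table 1).

```python
# Hypothetical categorisation matrix: stage -> indicator keywords.
# Both the stage names and the cues are illustrative assumptions.
MATRIX = {
    "problem identification": ["identify", "define", "variables"],
    "design": ["plan", "select", "model"],
    "implementation": ["substitute", "solve", "calculate"],
    "evaluation": ["check", "verify", "reasonableness"],
}

def code_step(step_text):
    """Assign a flowchart step to every stage whose indicators it matches."""
    text = step_text.lower()
    return [stage for stage, cues in MATRIX.items()
            if any(cue in text for cue in cues)]

print(code_step("Solve for k and calculate the cooling time"))  # ['implementation']
```

In the actual analysis such matching is done by the researchers’ judgement rather than keyword search, but the structure (a fixed matrix applied to each unit of analysis) is the same.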

figure 2

Stages followed during analysis of procedural flowcharts
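For illustration only, coding against a predefined categorisation matrix can be sketched in code. The categories and keyword indicators below are hypothetical stand-ins, not the actual matrix used in the study (Table 1): the sketch simply assigns a text segment to every predefined category whose indicators it contains.

```python
# Minimal sketch of deductive coding against a categorisation matrix.
# The categories and keyword indicators are hypothetical illustrations,
# not the matrix used in this study.

CATEGORISATION_MATRIX = {
    "formulate": ["interpret", "identify", "mathematise"],
    "solve": ["apply", "calculate", "execute"],
    "evaluate": ["check", "compare", "justify"],
    "communicate": ["present", "report", "explain"],
}

def code_segment(segment: str) -> list[str]:
    """Return every predefined category whose indicators appear in the segment."""
    text = segment.lower()
    return [category for category, indicators in CATEGORISATION_MATRIX.items()
            if any(word in text for word in indicators)]

segments = [
    "First I interpret the scenario and identify the variables.",
    "Then I calculate the area and check it against the estimate.",
]
coded = {segment: code_segment(segment) for segment in segments}
```

In practice the coding in this study was interpretive and done by researchers, not keyword matching; the sketch only illustrates how a fixed matrix constrains the categories available to the coder.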

The students’ procedural flowcharts were coded and interpreted on how they responded to different stages of problem solving. A researcher’s codes, interpretations and findings should be clearly derived and justified from the available data and then inform conclusions and interpretations, for confirmability (Tobin & Begley, 2004). The artefacts were shared between the researcher and his supervisors; each analysed them independently, and the analyses were then reviewed jointly. Schreier (2012) recommended that analysis be done by more than one person to promote thoroughness and broaden the interpretation of the data, noting further that if the categorisation matrix is clear and of high quality, the coding should produce very few discrepancies. Few discrepancies were observed, except that some stages on the students’ procedural flowcharts overlapped between skills.

This section presents results from the analysis of the interview data and students’ artefacts.

Semi-structured interviews

The thematic analysis of interviews resulted in two themes:

The utility of procedural flowcharts in supporting mathematics problem solving.

The utility of procedural flowcharts in supporting the integration of the four stages of mathematics problem solving.

In phase three, which prompted the targeted phase four study described here, teachers were asked the question, “How have you used procedural flowcharts to enhance teaching and learning of mathematics?” The question was not specific to problem solving, but the teachers’ observations and perceptions related strongly to problem solving and student-centred learning.

Theme 1 The utility of procedural flowcharts generally supports mathematics problem solving

The visual nature of procedural flowcharts was seen as an advantage to both teachers and students. For students, drawing a flowchart was easier than writing paragraphs to explain how they had arrived at the intended solution. For teachers, the flowchart was easier to process, enabling timely feedback to students. Developing a procedural flowchart at the first checkpoint in the PSMT allows teachers to provide valuable feedback, as a procedural flowchart can represent several processes more compactly than written text because of its visual nature. Engagement can be promoted because students can use the targeted feedback to improve their solutions, having already provided a detailed overview of how they propose to solve the problem.

They present steps in diagrammatic form which is easy to process and easy to understand and process… students prefer them more as its in diagrammatic form and I have witnessed more students engaging. (Participant 8, phase three study) I find it (visual) a really efficient way for me to look at the proposed individual students processes and provide relevant feedback to the student or for the student to consider. And, you know, once the students are comfortable with using these procedural flowcharts you know, I find it much easier for me to give them relevant feedback, and I actually find that feedback more worthwhile than feedback we used to give them, you know, that was just based on what they wrote in paragraphs,…students get to practice in creating their own visual display, which communicates their intended strategies to solve the problem, then they have opportunities to use it, and fine tune it as they work out the problem … student developed procedural flow charts, they represent a student’s maths knowledge in a visual way. (Ms. Simon).

Identifying students’ competencies early was seen as central to successful problem solving, as it provided opportunities for early intervention. Results showed that teachers viewed procedural flowcharts as a resource that could be used to identify gaps in skills, levels of understanding and misconceptions that could affect successful and meaningful execution of a problem-solving task. Going through a student-developed flowchart during problem solving provided the teachers with insight into the student’s level of understanding of the problem and the effectiveness of the procedures proposed to address it. This is critical for tasks that require students to produce a report detailing the solution at the end of the process: teachers get the opportunity to gain insight into the proposed solution before the student commits to writing the report, and the procedural flowchart provides the bigger picture of the solution plan, which might expose gaps in knowledge.

I found it quite useful because I can identify what kids or which kids are competent in what, which sort of problem-solving skills. And I can identify misconceptions that students have or gaps in students understanding. (Participant 1, phase three study) It also to me highlights gaps in students’ knowledge in unique ways that students intend to reach a solution because the use of the procedural flow chart encourages students to explain the steps or procedures behind any mathematical manipulation that you know they're intending to use. And it's something that was much more difficult to determine prior to using procedural flow charts… I've also used you know, student developed procedural flow charts to ascertain how narrow or wide the students’ knowledge is and that's also something that wasn't obvious to make a judgement about prior to using procedural flow charts. (Ms. Simon)

Problem solving was seen as student-centred. If procedural flowcharts could be used to support problem solving, then they could facilitate an environment where students were the ones to do most of the work. The students could develop procedural flowcharts showing how they will solve a PSMT task using concepts and procedures they have learnt. The open-ended nature of the problem in a PSMT provides opportunities for diverse solutions that are validated through mathematical justifications. The visual nature of procedural flowcharts makes them more efficient to navigate compared to text.

Mathematics goes from being very dry and dusty to being something which is actually creative and interesting and evolving, starting to get kids actually engaging and having to back themselves. (Participant 7, phase three study) As a teacher, I find that procedural flowcharts are a really efficient way to ascertain the ways that students have considered and how they are going to solve a problem … It engages the students from start to finish, you know in different ways this method demands students to compare, interpret, analyse, reason, evaluate, and to an extent justify as they develop this solution. (Ms. Simon)

Similarly, results showed that procedural flowcharts could be used as a resource to promote collaborative learning and scaffolding. Students could be asked to collaboratively develop a procedural flowchart or could be provided with one to follow as they worked towards solving the problem. Collaborative development of procedural flowcharts can support problem solving as students can bring their different mathematical understanding to develop a solution from different perspectives.

Sometimes, you know, I get students to work on it in groups as they share ideas and get that mathematisation happening. So, it's really helpful there … I looked at the PSMT and its Marking Guide, and develop a more detailed procedural flowchart for students to use as a scaffold to guide them through the process. So, procedural flowcharts provide a structure in a more visual way for students to know what to do next. (Ms. Simon)

Ms. Simon shared her detailed procedural flowchart in Fig.  3 that she used to guide students in PSMTs.

figure 3

Ms. Simon’s procedural flowchart on problem solving

The participants also observed that procedural flowcharts could be used to promote opportunities for solution evaluation, which played an important role in problem solving. Loops can be introduced in procedural flowcharts to provide opportunities for reflection and reasoning, as alternative paths provide flexibility while the solution is being developed. Following Fig. 4 are participants’ comments referring to the figure, which was among the procedural flowcharts shared with participants as examples of how flowcharts can be used to teach syllabus-identified Mathematical Methods concepts. The Mathematical Methods syllabus expects students to “recognise the distinction between functions and relations and use the vertical line test to determine whether a relation is a function” (QCAA, 2018, p. 20).

The cycle approach, the feeding back in the feeding back out that type of stuff, you know, that is when we starting to teach students how to think . (Participant 7, phase three study) Complex procedural flowcharts like the one you provided guide students in making key decisions as they work through solutions which is key to critical thinking and judgement and these two are very important in maths. (Participant 8, phase three study) I also sincerely believe that procedural flowcharts are a way to get students to develop and demonstrate the critical thinking skills, which PSMTs are designed to assess. Students inadvertently have to use their critical thinking skills to analyse and reason as they search for different ways to obtain a solution to the problem presented in the PSMT … the use of procedural flowcharts naturally permits students to develop their critical thinking skills as it gets their brain into a problem-solving mode as they go through higher order thinking skills such as analysis, reasoning and synthesis and the like … this visual way of presenting solution provides students with opportunities to think differently, which they're not used to do, and it leads them to reflect and compare. (Ms. Simon)

figure 4

Procedural flowchart on distinguishing functions and relations
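The syllabus expectation quoted above, distinguishing functions from relations via the vertical line test, has a direct discrete analogue that can be sketched in code. This sketch is illustrative only and is not part of the study’s materials: a relation given as a set of (x, y) pairs is a function exactly when no x-value is paired with more than one distinct y-value.

```python
# Discrete analogue of the vertical line test: a relation (a set of
# (x, y) pairs) is a function only if each x-value maps to a single
# y-value. Illustrative sketch, not part of the study's materials.

def is_function(relation):
    """Return True if every x in the relation maps to exactly one y."""
    seen = {}
    for x, y in relation:
        if x in seen and seen[x] != y:
            return False  # a vertical line at this x would cross twice
        seen[x] = y
    return True

parabola = [(-2, 4), (-1, 1), (0, 0), (1, 1), (2, 4)]   # points on y = x^2
circle = [(0, 1), (0, -1), (1, 0), (-1, 0)]             # points on x^2 + y^2 = 1
```

Here the parabola passes the test (a function) while the circle fails it (a relation but not a function), mirroring the decision branch in Fig. 4.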

Solving non-routine problems follows a structure, and resources intended to support students’ problem solving can also be used to support the integration of the stages involved.

Theme 2 The utility of procedural flowcharts in supporting the integration of the four stages of mathematics problem solving.

Procedural flowcharts can support the flow of ideas and processes in the four stages of a problem-solving and modelling task in the Mathematical Methods subject. The literature synthesis in this study identified the four stages as:

Identification of the problem and mathematics strategies that can solve the problem.

Implementation.

Evaluation and justification.

Communicating the solution.

Similarly, the QCAA flowchart on PSMTs identifies the four stages as formulate, solve, evaluate and verify, and communicate.
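For illustration, the four QCAA stages, together with the feedback loops participants later described (returning to earlier stages when evaluation fails), can be sketched as a simple iterative cycle. The function names and loop structure below are hypothetical illustrations, not QCAA material.

```python
# Illustrative sketch (not QCAA material): the four PSMT stages as an
# iterative cycle in which a failed evaluation loops back to
# formulation, mirroring the feedback loops shown in flowcharts.

STAGES = ["formulate", "solve", "evaluate and verify", "communicate"]

def run_cycle(evaluation_passes, max_attempts=3):
    """Step through the stages, looping back while evaluation fails."""
    history = []
    for attempt in range(1, max_attempts + 1):
        history += ["formulate", "solve", "evaluate and verify"]
        if evaluation_passes(attempt):
            history.append("communicate")  # only reached once verified
            return history
    return history

# e.g. a solution that only verifies on the second attempt:
trace = run_cycle(lambda attempt: attempt >= 2)
```

The point of the sketch is simply that "communicate" is reached only after evaluation succeeds, which is the structural role the loops play in the flowcharts discussed below.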

The logical sequencing of the stages of mathematics problem solving is crucial to solving and communicating the solution to the problem. Development of procedural flowcharts can play an important role in problem solving through fostering the logical sequencing of processes to reach a solution. Participants noted that the development of procedural flowcharts provides opportunities for showing the flow of ideas and processes which lay out an overview of how different stages connect into a bigger framework of the solution. Furthermore, it can help show how different pieces of a puzzle interconnect, in this case how all the components of the solution interconnect and develop to address the problem. In fact, procedural flowcharts can be used to show how the different mathematics concepts students have learnt can be brought together in a logical way to respond to a problem.

Procedural flowcharts help students sum up and connect the pieces together… connect the bits of knowledge together. (Participant 4, phase three study) Really good how it organises the steps and explains where you need to go if you're at a certain part in a procedure. (Participant 2, phase three study) Potentially, it's also an excellent visual presentation, which shows a student's draft of their logical sequence of processes that they're intending to develop to solve the problem … So, the steps students need to follow actually flows logically. So really given a real-life scenario they need to solve in a PSMT students need to mathematise it and turn it into a math plan, where they execute their process, evaluate and verify it and then conclude … so we use procedural flowcharts to reinforce the structure of how to approach problem-solving … kids, you know, they really struggling, you know, presenting things in a logical way, because they presume that we know what they're thinking . (Ms. Simon)

Developing procedural flowcharts provided students with opportunities to plan the solution informed by the stages of problem solving. Teachers could reinforce the structure of problem solving by telling students what they could expect to be included on the procedural flowchart. Procedural flowcharts can be used as a visual tool to highlight all the critical stages that are included during the planning of the solution.

I tell the students, “I need to see how you have interpreted the problem that you need to solve. I need to see how you formulated your model that involves the process of mathematisation, where you move from the real world into the maths world, and I need to see all the different skills you're intending to use to arrive at your solution.” (Ms. Simon)

Similarly, procedural flowcharts could visually represent more than one strategy in the “identify and execute mathematics procedures that can solve the problem” stage, thereby providing a critical resource to demonstrate flexibility. When there are multiple ways of addressing a problem, developing a procedural flowchart can provide an opportunity of showing all possible paths or relationships between different paths to the solution, thus promoting flexibility. Procedural flowcharts provide an opportunity to show how different procedures can be used or integrated to solve a problem.

Students are expected to show evidence that they have the knowledge of solving the problem using several ways to get to the same solution. So, it goes beyond the students’ preferred way of answering a question and actually highlights the importance of flexibility when it comes to processes and strategies of solving a problem … By using procedural flowcharts, I'm saying to the students, “Apart from your preferred way of solving the problem, give me a map of other routes, you can also use to get to your destination.” (Ms. Simon)

The results also indicated that procedural flowcharts could be used to identify strengths and limitations of procedures in the “evaluate solution” stage and thus demonstrate the reasonableness of the answer. Having more than one way of solving a problem on a procedural flowchart helps in comparing and evaluating the most ideal way to address the problem.

And I'm finding that, you know, as students go through, and they compare the different processes, you know, the strengths and limitations, literally stare them in the face. So, they don't have to. They're not ... they don't struggle as much as they used to in coming up with those sorts of answers … it's also a really easy way that once the students reach the next phase, which is the evaluating verified stage, they can go back to their procedural flow chart and identify and explain strengths and limitations of their model … It's a convenient way for students to show their reasonableness of their solution by comparing strengths and weaknesses of all the strategies presented on the procedural flowchart, something that they've struggled with in the past. (Ms. Simon)

The results from the interviews show that procedural flowcharts supported efficient communication of the steps to be followed in developing the solution to the problem. Student-developed procedural flowcharts allowed the teacher to gain an insight into, and overview of, the solution early in the assessment task. In addition, they provided students with an alternative way of presenting their solutions to the teacher.

I expect students to use the procedural flowchart as a way to communicate to me how they're planning to solve the scenario in the PSMT…It's also one of the parts that students are expected to hand in to me on one of the check points, and I find it a really efficient way for me to look at, you know, a proposed individual students processes, and provide relevant feedback to the student to consider in a really efficient way…I just found that it helps students communicate their solution to a problem in lots of different ways that challenges students to logically present a solution. (Ms. Simon)

She went on to say,

Students also found it challenging to communicate their ideas in one or two paragraphs, when more than one process or step was required to solve the problem. So, I found that, you know, procedural flowcharts, have filled this gap really nicely, as that provides students with a simple tool that they can use to present a visual overview of the processes they've chosen to use to solve the problem. And so, for me, as a teacher, procedural flowcharts are an efficient way for me to scan the intended processes that an individual student is proposing to use to solve the problem in their authentic way and provide them with valuable feedback.

In summary, the teacher’s experiences, views and perceptions showed that procedural flowcharts can be a valuable resource in supporting students in all four stages of problem solving.

Students’ artefacts

The student-generated flowcharts in this part of the research gave an insight into students’ understanding as they planned how to solve the problem presented to them. Students were expected to use the problem-solving stages to successfully develop solutions to problems. Their de-identified procedural flowcharts are shown in Figs.  5 , 6 and 7 .

figure 5

Procedural flowchart developed by student 1

figure 6

Procedural flowchart developed by student 2

figure 7

Collaboratively developed procedural flowchart

Students 1 and 2 also collaboratively developed a procedural flowchart, shown as Fig.  7 .

This discussion is presented as two sections: (1) how developing procedural flowcharts can support mathematics problem solving and (2) how developing procedural flowcharts can support the integration of the different stages of mathematics problem solving. This study, although limited by its sample size, highlighted how developing procedural flowcharts can support mathematics problem solving, reinforce the structure of the solution to a problem and help develop metacognitive skills among students. The different stages involved in problem solving inform the process of developing the solution. The focus on problem-based learning has signalled the need for resources that can support students and teachers in developing and structuring solutions to problems. Results from this study have also provided discussion points on how procedural flowcharts can have a positive impact on mathematics problem solving.

Procedural flowcharts can support mathematics problem solving

Procedural flowcharts help in visualising the process of problem solving. The results described in this study show that student-generated flowcharts can provide an overview of the proposed solution to the problem. The study noted that students preferred developing procedural flowcharts to writing out how they planned to find a solution. The teachers also preferred visual aids because they were easier and quicker to process and facilitated understanding of the steps taken to reach the solution. These results are consistent with the findings of other researchers (McGowan & Boscia, 2016; Raiyn, 2016). They are also consistent with Grosskinsky and colleagues’ (2019) finding that flowcharts break complex information into separate tasks and show how those tasks are connected, thereby enhancing understanding of the process. Consequently, flowcharts allow teachers to provide timely feedback at a checkpoint in far less time than going through a written draft would take. Procedural flowcharts connect the procedures and processes in a solution to a problem (Chinofunga et al., 2022), so the feedback provided by the teacher can be targeted at a particular stage identified on the flowchart, making it more effective and worthwhile. The development of a procedural flowchart during problem solving can be viewed as a visual representation of a student’s plan and understanding of how to solve the problem, as demonstrated in Figs. 5, 6 and 7.

In this study, Ms. Simon noted that procedural flowcharts can represent students’ knowledge or thinking in a visual form, which is consistent with Owens and Clements’ (1998) finding that visual representations are cognitive constructs. Consequently, flowcharts can facilitate evaluation of such knowledge. This study noted that developing procedural flowcharts can provide opportunities to identify gaps in students’ understanding and problem-solving skills. It also noted that providing students with opportunities to develop procedural flowcharts may expose students’ misconceptions, the depth and breadth of their understanding of the problem and how they plan to solve it. This is supported by significant research (Grosskinsky et al., 2019; Norton et al., 2007; Vale & Barbosa, 2018), which identified flowcharts as a resource for visualising and recognising students’ understanding of a problem and their communication of the solution. Thus, providing teachers with insight into students’ thinking can facilitate intervention early in the process. The results in this study showed that when students develop their own plan for responding to a problem, they are at the centre of their learning. However, scaffolding and collaborative learning can also support problem solving.

Vygotsky (1978) posited that, within the zone of proximal development, collaborative learning and scaffolding can facilitate understanding. In this study, the results indicated that a teacher-developed procedural flowchart can be used to guide students in developing a solution to a problem. These results are consistent with Davidowitz and Rollnick’s (2001) study, which concluded that flowcharts provide a bigger picture of how to solve the problem. In Queensland, the QCAA has developed a flowchart (see Appendix 1) to guide schools on problem-solving and modelling tasks. It highlights the significant stages to be considered during the process and how they relate to each other. Teachers are encouraged to contextualise official documents to suit their schools and classes. In such cases, a procedural flowchart acts as a scaffolding resource, directing students on how to develop the solution to the problem. The findings are consistent with previous literature indicating that flowcharts can give an overall direction to the process, help explain what is involved, may help reduce cognitive load and allow students to focus on complex tasks (Davidowitz & Rollnick, 2001; Norton et al., 2007; Sweller et al., 2019).

In addition to being a scaffolding resource, results showed that procedural flowcharts can be developed collaboratively providing students with an opportunity to share their solution to the problem. Being a scaffolding resource or a resource to use in a community of learning highlights the importance of procedural flowcharts in promoting learning within a zone of proximal development, as posited by Davidowitz and Rollnick ( 2001 ). Scaffolding students to problem solve and develop procedural flowcharts collaboratively provides students with the opportunity to be at the centre of problem solving.

Research has identified problem solving as student-centred learning (Ahmad et al., 2010 ; Karp & Wasserman, 2015 ; Reinholz, 2020 ; Vale & Barbosa, 2018 ). The process of developing the procedural flowcharts as students plan for the solution provides students with opportunities to engage more with the problem. Results showed that when students developed procedural flowcharts themselves, mathematics learning transformed from students just being told what to do or follow procedures into something creative and interesting. As students develop procedural flowcharts, they use concepts they have learnt to develop a solution to an unfamiliar problem (Matty, 2016 ), thus engaging with learning from the beginning of the process until they finalise the solution. The results indicated that developing procedural flowcharts promoted students’ ability to not only integrate different procedures to solve the problem but also determine how and when the conditions were ideal to address the problem, providing opportunities to justify and evaluate the procedures that were used.

Deeper understanding of mathematics and of the relationships between concepts plays an important role in problem solving, and the results from this study showed that different procedures can be integrated to develop a solution to a problem. The participants observed that developing procedural flowcharts could support the brainstorming of ideas, as ideas may interlink in a non-linear way. Moreover, students are expected at different stages to make key decisions about the direction they will need to take to reach the solution, as more than one strategy may be available. For example, student 1 planned to use only technology to develop the models, while student 2 considered both technology and algebra. This showed that student 2 applied flexibility in using alternative methods, demonstrating a deeper understanding of the problem. Equally important, Ms. Simon observed that as students developed their procedural flowcharts while planning the steps to reach a solution, they were required to analyse, conceptualise, reason, synthesise and evaluate, which are important attributes of deeper understanding. Fostering deeper understanding of mathematics is the key goal of using problem solving (Kim et al., 2012; King, 1995; Moon, 2008; QCAA, 2018). The results are additionally consistent with findings from Owens and Clements (1998) and Roam (2009), who posited that visual aids foster reasoning and reveal cognitive constructs. Similarly, the logical sequencing of procedures and of ways to execute a strategy expected when developing a procedural flowchart can support deeper understanding, as posited by Parvaneh and Duncan (2021). When developing a procedural flowchart, students are required to link ideas that are related or that feed into one another, creating a web of knowledge. Students are also required to identify the ways in which a concept is applied as they develop a solution, and this requires deeper understanding of mathematics.
Working collaboratively can also support deeper and broader understanding of mathematics.

The procedural flowchart developed collaboratively by the two students demonstrated some skills that they did not demonstrate in their individual procedural flowcharts. Like student 2’s, the collaboratively developed flowchart included the use of technology and algebra to determine the models for the three different cups. The students considered both rate of change and area under a curve in the task analysis. Apart from planning to use the rate at a point, the average rate and definite integration, they added the trapezoidal rule. Both the average rate and definite integration were to be applied within the same intervals, building in scope for comparison; the trapezoidal rule would likewise be compared with integration. The complexity of the collaboratively developed procedural flowchart concurred with Rogoff and others (1984) and Stone (1998), who suggested that a community of learning can extend current skills to higher levels than individuals could achieve on their own. It seems the students used the feedback provided by the teacher on their individually developed procedural flowcharts as scaffolding to develop a much more complex flowchart with competing procedures to address the problem. Their individually developed flowcharts might have acted as reference points, as their initial plans were still included in the collaboratively developed plan but with better clarity. This observation is consistent with Guk and Kellogg (2007), Kirova and Jamison (2018) and Ouyang and colleagues (2022), who noted that scaffolding involving peers, the teacher and other resources enhances complex problem-solving tasks and the transfer of skills.
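The comparison the students planned, definite integration against a trapezoidal-rule approximation over the same interval, can be sketched as follows. The profile function used here is a hypothetical stand-in, not the model from the students’ task.

```python
# Sketch of the planned comparison: a definite integral versus a
# trapezoidal-rule approximation over the same interval. The function
# f below is a hypothetical stand-in, not the task's cup model.

def f(x):
    return x ** 2  # hypothetical profile curve

def trapezoidal(g, a, b, n=100):
    """Approximate the integral of g on [a, b] using n trapezoids."""
    h = (b - a) / n
    total = (g(a) + g(b)) / 2 + sum(g(a + i * h) for i in range(1, n))
    return total * h

exact = 8 / 3                     # definite integral of x^2 on [0, 2]
approx = trapezoidal(f, 0, 2)
error = abs(approx - exact)       # shrinks as n grows
```

Placing both procedures side by side like this is precisely what gives the flowchart its evaluative power: the same interval, two strategies, and an observable discrepancy to reason about.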

Supporting the integration of the different stages of mathematics problem solving

When students develop procedural flowcharts, doing so supports the logical sequencing of ideas from different stages into a process that ends with a solution. Problem solving follows a proposed order, and procedural flowcharts visually display decision and/or action sequences in a logical order (Krohn, 1983). They are used when a sequenced order of ideas is emphasised, such as in problem solving (Cantatore & Stevens, 2016). This study concurs with Krohn and with Cantatore and Stevens, as the results showed that procedural flowcharts could be used to organise steps and ideas logically as students worked towards developing a solution. Students’ procedural flowcharts are expected to be developed through the following stages: problem identification, problem mathematisation, planning and execution, and finally evaluation. Such a structure can be reinforced by teachers sharing a generic problem-solving flowchart outlining the stages so that students can then develop a problem-specific version. Importantly, the students’ artefacts in Figs. 5, 6 and 7 provided evidence of how procedural flowcharts support the different problem-solving stages to create a logical and sequential flow of the solution (see Appendix 1). Similarly, Ms. Simon noted that while her students had previously had problems presenting the steps of their solutions in a logical way, she witnessed a significant improvement after she asked them to develop procedural flowcharts first. Further, the results are consistent with Chinofunga et al.’s (2022) work showing that procedural flowcharts can support procedural flexibility, as they can accommodate more than one procedure in the “identify and execute mathematics procedures that can solve the problem” stage. Thus, stages that require one procedure or several can all be accommodated in a single procedural flowchart. Evaluating the different procedures is also a key stage in problem solving.

As students develop the solution to the problem and identify ways to address it, they also have to evaluate the procedures, reflecting on the limitations and strengths of the solutions those procedures offer. Ms. Simon observed that her students had previously struggled with identifying the strengths and weaknesses of different procedures. However, she noted that procedural flowcharts gave students the opportunity to reflect and compare as they planned the solution. For example, students could reflect on and compare the rate at a point, the average rate and integration so that they could evaluate which strategy would best address the problem. The artefacts identified the different procedures the students used in planning the solution, enabling them to evaluate the effectiveness of each strategy. Enhancing students’ capacity to make decisions and identify the optimal strategy to solve a problem aligns with the work of McGowan and Boscia (2016). Similarly, Chinofunga and colleagues’ (2022) findings indicated that developing procedural flowcharts can be effective in evaluating different procedures, as flowcharts can accommodate several procedures. The stages that need to be followed during problem solving and the way the solution is logically presented are central to how the final product is communicated.

In this study, procedural flowcharts were used to communicate the plan for reaching the solution to a problem. The time given to students to work on their problem-solving tasks in Queensland is fairly long (4 weeks), and students may struggle to remember key processes along the way. Developing a procedural flowchart to gain an overview of the solution and sharing it with the teacher at an early checkpoint is therefore particularly valuable. In this study, Ms. Simon expected her students to share their procedural flowcharts early in the process so that she could give feedback, thus making the flowcharts a communication tool. The procedural flowcharts developed by the students in Figs. 5, 6 and 7 show how the students proposed solving the problem. This result lends further support to the NCTM (2000) position that visual representations can help students communicate their thinking before applying those thoughts to solving a problem. Ms. Simon also noted that before being introduced to procedural flowcharts, students did not have an overall coherent structure to follow, which presented challenges when they wanted to communicate a plan that involved more than one strategy. In contrast, the students' artefacts were meaningful, clearly articulating how the solution to the problem was being developed, demonstrating that procedural flowcharts can provide the structure that supports coherent and logical communication of the solution by both teachers and students (Norton et al., 2007). The visual nature of the students' responses in the form of procedural flowcharts is key to communicating the proposed solution.

Visual representations are a favourable alternative to narrative communication. Procedural flowcharts can help teachers check students' work faster and give critical feedback in a timely manner. Ms. Simon noted that the flowcharts allowed her to give feedback more quickly and effectively early in the task because they provided her with an overview of the whole proposed solution. Considering that students are expected to write a report of 2000 words or 10 pages on the task, a procedural flowchart presents a large amount of information in a single visual representation. Raiyn (2016) likewise noted that visual representations can be a quicker way to evaluate a solution and can represent large amounts of information.

The procedural flowcharts created by students in this study demonstrate that flowcharts can be effective in supporting the development of problem-solving skills. This study suggests that including procedural flowcharts in problem solving may help teachers and students communicate efficiently about how to solve the problem. For students, the flowchart is a resource that provides an overview of the solution, while teachers can consider it a representation of students' thinking as they plan the steps to reach a solution. A student-developed procedural flowchart may represent how a student visualises a solution to a problem after brainstorming different pathways and decision-making stages.

Moreover, as highlighted in this study, the visual nature of procedural flowcharts may offer a diverse range of support for problem solving. Procedural flowcharts make student work easier to process, enabling timely feedback that in turn might help students engage with the problem meaningfully. Furthermore, they may provide a structure for the problem-solving process and guide students through it. Navigating the stages of problem solving might be supported by having students design procedural flowcharts first and then execute the plan. Indeed, this study showed that the ability of procedural flowcharts to represent multiple procedures, evaluation stages or loops, and alternative paths helps students reflect on how to present a logically cohesive solution. Importantly, procedural flowcharts were also identified as a resource that can help students communicate the solution to the problem. Procedural flowcharts have been noted to support deeper understanding, as they may facilitate analysis, logical sequencing, reflection, reasoning, evaluation and communication. Although the in-depth study involved one teacher and three artefacts from her students, too small a sample to be conclusive, it identified numerous advantages that procedural flowcharts bring to mathematics learning and teaching, particularly in supporting the development of problem-solving skills. The study calls for further investigation of how procedural flowcharts can support students' problem solving.

Ahmad, A., Tarmizi, R. A., & Nawawi, M. (2010). Visual representations in mathematical word problem-solving among form four students in malacca. Procedia - Social and Behavioral Sciences, 8 , 356–361. https://doi.org/10.1016/j.sbspro.2010.12.050

Anderson, J. (2014). Forging new opportunities for problem solving in Australian mathematics classrooms through the first national mathematics curriculum. In Y. Li & G. Lappan (Eds.), Mathematics curriculum in school education (pp. 209–230). Springer.

Arcavi, A. (2003). The role of visual representations in the learning of mathematics. Educational Studies in Mathematics, 52 (3), 215–241. https://doi.org/10.1023/A:1024312321077

Artigue, M., & Blomhøj, M. (2013). Conceptualizing Inquiry-Based Education in Mathematics. ZDM, 45 (6), 797–810. https://doi.org/10.1007/s11858-013-0506-6

Artigue, M., Bosch, M., Doorman, M., Juhász, P., Kvasz, L., & Maass, K. (2020). Inquiry based mathematics education and the development of learning trajectories. Teaching Mathematics and Computer Science, 18 (3), 63–89. https://doi.org/10.5485/TMCS.2020.0505

Australian Curriculum, Assessment and Reporting Authority. (2014). Mathematics proficiencies (Version 8.4) . https://www.australiancurriculum.edu.au/resources/mathematics-proficiencies/portfolios/problem-solving/

Azungah, T. (2018). Qualitative research: deductive and inductive approaches to data analysis. Qualitative Research Journal, 18 (4), 383–400. https://doi.org/10.1108/QRJ-D-18-00035

Berger, M. (1998). Graphic calculators: An Interpretative framework. For the Learning of Mathematics, 18 (2), 13–20.

Blum, W., & Leiß, D. (2007). How do students and teachers deal with modelling problems? In Mathematical modelling: Education, engineering and economics (pp. 222–231). https://doi.org/10.1533/9780857099419.5.221

Bos, R., & van den Bogaart, T. (2022). Heuristic trees as a digital tool to foster compression and decompression in problem-solving. Digital Experiences in Mathematics Education, 8 (2), 157–182. https://doi.org/10.1007/s40751-022-00101-6

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3 (2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA Handbook of Research Methods in Psychology, Research Designs (Vol. 2, pp. 57–71). American Psychological Association.

Braun, V., & Clarke, V. (2021). One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qualitative Research in Psychology, 18 (3), 328–352. https://doi.org/10.1080/14780887.2020.1769238

Bree, R. T., Dunne, K., Brereton, B., Gallagher, G., & Dallat, J. (2014). Engaging learning and addressing over-assessment in the Science laboratory: Solving a pervasive problem. The All-Ireland Journal of Teaching and Learning in Higher Education, 6 (3), 206.1–206.36. http://ojs.aishe.org/index.php/aishe-j/article/viewFile/206/290

Burns, N., & Grove, S. (2009). The practice of nursing research: Appraisal, synthesis and generation of evidence (6th ed.). St. Louis: Saunders Elsevier. https://doi.org/10.7748/ns2013.04.27.31.30.b1488

Byrne, D. (2022). A worked example of Braun and Clarke’s approach to reflexive thematic analysis. Quality & Quantity, 56 (3), 1391–1412. https://doi.org/10.1007/s11135-021-01182-y

Cai, J. (2010). Helping elementary school students become successful mathematical problem solvers. In D. Lambdin (Ed.), Teaching and learning mathematics: Translating research to the elementary classroom (pp. 9–14). Reston, VA: National Council of Teachers of Mathematics.

Cantatore, F., & Stevens, I. (2016). Making connections : Incorporating visual learning in law subjects through mind mapping and flowcharts. Canterbury Law Review, 22 (1), 153–170. https://doi.org/10.3316/agis_archive.20173661

Chinofunga, M. D., Chigeza, P., & Taylor, S. (2022). Procedural flowcharts can enhance senior secondary mathematics. In N. Fitzallen, C. Murphy, & V. Hatisaru (Eds.), Mathematical confluences and journeys (Proceedings of the 44th Annual Conference of the Mathematics Education Research Group of Australasia, July 3-7) (pp. 130–137). Launceston: MERGA. https://files.eric.ed.gov/fulltext/ED623874.pdf

Colburn, A. (2000). An inquiry primer. Science Scope, 23 (6), 42–44. http://www.cyberbee.com/inquiryprimer.pdf

Cole, F. L. (1988). Content analysis: Process and application. Clinical Nurse Specialist, 2 (1), 53–57. https://doi.org/10.1097/00002800-198800210-00025

Csíkos, C., Szitányi, J., & Kelemen, R. (2012). The effects of using drawings in developing young children’s mathematical word problem solving: A design experiment with third-grade Hungarian students. Educational Studies in Mathematics, 81 , 47–65. https://doi.org/10.1007/s10649-011-9360-z

Davidowitz, B., & Rollnick, M. (2001). Effectiveness of flow diagrams as a strategy for learning in laboratories. Australian Journal of Education in Chemistry, (57), 18–24. https://search.informit.org/doi/10.3316/aeipt.129151

De Massis, A., & Kotlar, J. (2014). The case study method in family business research: Guidelines for qualitative scholarship. Journal of Family Business Strategy, 5 (1), 15–29. https://doi.org/10.1016/j.jfbs.2014.01.007

Dorier, J.-L., & Maass, K. (2020). Inquiry-based mathematics education. In S. Lerman (Ed.), Encyclopedia of mathematics education (pp. 384–388). Springer. https://doi.org/10.1007/978-3-030-15789-0_176

Doyle, K. M. (2005). Mathematical problem solving: A need for literacy. In F. Bryer, B. Bartlett, & D. Roebuck (Eds.), Proceedings Stimulating the “Action” as participants in participatory research 2 (pp. 39–45). Australia: Surfers Paradise.

Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62 (1), 107–115. https://doi.org/10.1111/j.1365-2648.2007.04569.x

Elo, S., Kääriäinen, M., Kanste, O., Pölkki, T., Utriainen, K., & Kyngäs, H. (2014). Qualitative content analysis: A focus on trustworthiness. SAGE Open, 4 (1), 215824401452263. https://doi.org/10.1177/2158244014522633

English, L., & Gainsburg, J. (2016). Problem solving in a 21st-century mathematics curriculum. In L. English & D. Kirshner (Eds.), Handbook of international research in mathematics education (3rd ed., pp. 313–335). New York, NY: Routledge.

Galbraith, P., & Stillman, G. (2006). A framework for identifying student blockages during transitions in the modelling process. ZDM – Mathematics Education, 38 (2), 143–162. https://doi.org/10.1007/BF02655886

Geiger, V., Galbraith, P., Niss, M., & Delzoppo, C. (2021). Developing a task design and implementation framework for fostering mathematical modelling competencies. Educational Studies in Mathematics, 109 (2), 313–336. https://doi.org/10.1007/s10649-021-10039-y

Gencer, S. (2023). Development and use of flowchart for preservice chemistry teachers’ problem solving on the first law of thermodynamics. Journal of Chemical Education, 100 (9), 3393–3401. https://doi.org/10.1021/acs.jchemed.3c00224

Grosskinsky, D. K., Jørgensen, K., & Hammer úr Skúoy, K. (2019). A flowchart as a tool to support student learning in a laboratory exercise. Dansk Universitetspædagogisk Tidsskrift, 14 (26), 23–35. https://doi.org/10.7146/dut.v14i26.104402

Guion, L. A., Diehl, D. C., & McDonald, D. (2011). Triangulation: Establishing the validity of qualitative studies. EDIS, (8), 3–3. https://doi.org/10.32473/edis-fy394-2011

Guk, I., & Kellogg, D. (2007). The ZPD and whole class teaching: Teacher-led and student-led interactional mediation of tasks. Language Teaching Research, 11 (3), 281–299. https://doi.org/10.1177/1362168807077561

Guthrie, J., Petty, R., Yongvanich, K., & Ricceri, F. (2004). Using content analysis as a research method to inquire into intellectual capital reporting. Journal of Intellectual Capital, 5 (2), 282–293. https://doi.org/10.1108/14691930410533704

Hacker, D. J., Dunlosky, J., & Graesser, A. C. (Eds.). (1998). Metacognition in educational theory and practice (1st ed.). Routledge. https://doi.org/10.4324/9781410602350

Hadamard, J. (1945). The psychology of invention in the mathematical field . Princeton, NJ: Princeton University Press.

Hankeln, C. (2020). Mathematical modeling in Germany and France: A comparison of students’ modeling processes. Educational Studies in Mathematics, 103 (2), 209–229. https://doi.org/10.1007/s10649-019-09931-5

Hooshyar, D., Ahmad, R. B., Yousefi, M., Fathi, M., Horng, S.-J., & Lim, H. (2016). Applying an online game-based formative assessment in a flowchart-based intelligent tutoring system for improving problem-solving skills. Computers and Education, 94 , 18–36. https://doi.org/10.1016/j.compedu.2015.10.013

Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15 (9), 1277–1288. https://doi.org/10.1177/1049732305276687

Huang, X., Huang, R., & Trouche, L. (2023). Teachers’ learning from addressing the challenges of online teaching in a time of pandemic: A case in Shanghai. Educational Studies in Mathematics, 112 (1), 103–121. https://doi.org/10.1007/s10649-022-10172-2

Hughes, J. R. A., & Goodwin, J. (2014). Editors’ introduction: Human documents and archival research . University of Leicester. Chapter. https://hdl.handle.net/2381/31547

Jitendra, A. K., Dupuis, D. N., Rodriguez, M. C., Zaslofsky, A. F., Slater, S., Cozine-Corroy, K., & Church, C. (2013). A randomized controlled trial of the impact of schema-based instruction on mathematical outcomes for third-grade students with mathematics difficulties. The Elementary School Journal, 114 (2), 252–276. https://doi.org/10.1086/673199

Jonassen, D. H. (2012). Designing for decision making. Educational Technology Research and Development, 60 (2), 341–359. https://doi.org/10.1007/s11423-011-9230-5

Kaitera, S., & Harmoinen, S. (2022). Developing mathematical problem-solving skills in primary school by using visual representations on heuristics. LUMAT: International Journal on Math, Science and Technology Education, 10 (2), 111–146. https://doi.org/10.31129/LUMAT.10.2.1696

Karp, A., & Wasserman, N. (2015). Mathematics in middle and secondary school: A problem-solving approach . Charlotte, North Carolina: Information Age Publishing Inc.

Kim, K., Sharma, P., Land, S. M., & Furlong, K. P. (2012). Effects of active learning on enhancing student critical thinking in an undergraduate general science course. Innovative Higher Education, 38 (3), 223–235. https://doi.org/10.1007/s10755-012-9236-x

King, A. (1995). Designing the instructional process to enhance critical thinking across the curriculum: Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22 (1), 13–17. https://doi.org/10.1207/s15328023top2201_5

King, N. (2004). Using templates in the thematic analysis of text. In C. Cassell & G. Symon (Eds.), Essential guide to qualitative methods in organizational research (pp. 257–270). London, UK: Sage. https://doi.org/10.4135/9781446280119

Kingsdorf, S., & Krawec, J. (2014). Error analysis of mathematical word problem solving across students with and without learning disabilities. Learning Disabilities Research & Practice, 29 (2), 66–74. https://doi.org/10.1111/ldrp.12029

Kirova, A., & Jamison, N. M. (2018). Peer scaffolding techniques and approaches in preschool children’s multiliteracy practices with iPads. Journal of Early Childhood Research, 16 (3), 245–257. https://doi.org/10.1177/1476718X18775762

Koellner, K., Jacobs, J., & Borko, H. (2011). Mathematics professional development: Critical features for developing leadership skills and building teachers’ capacity. Mathematics Teacher Education & Development, 13 (1), 115–136. Retrieved from https://files.eric.ed.gov/fulltext/EJ960952.pdf

Krawec, J. L. (2014). Problem representation and mathematical problem solving of students of varying math ability. Journal of Learning Disabilities, 47 , 103–115. https://doi.org/10.1177/0022219412436976

Krohn, G. S. (1983). Flowcharts used for procedural instructions. Human Factors, 25 (5), 573–581. https://doi.org/10.1177/001872088302500511

Krutetskii, V. A. (1976). The psychology of mathematical abilities in schoolchildren . Chicago: University of Chicago Press.

Kules, B. (2016). Computational thinking is critical thinking: Connecting to university discourse, goals, and learning outcomes. Proceedings of the Association for Information Science and Technology, 53 (1), 1–6. https://doi.org/10.1002/pra2.2016.14505301092

Kyngäs, H., Mikkonen, K., & Kääriäinen, M. (2020). The application of content analysis in nursing science research . Cham: Springer. https://doi.org/10.1007/978-3-030-30199-6

Laughlin, P. R., Hatch, E. C., Silver, J. S., & Boh, L. (2006). Groups perform better than the best individuals on letters-to-numbers problems: Effects of group size. Journal of Personality and Social Psychology, 90 (4), 644–651. https://doi.org/10.1037/0022-3514.90.4.644

Lederman, R. P. (1991). Content analysis: Steps to a more precise coding procedure. MCN, The American Journal of Maternal Child Nursing, 16 (5), 275–275. https://doi.org/10.1097/00005721-199109000-00012

Ledin, P., & Machin, D. (2020). The misleading nature of flow charts and diagrams in organizational communication: The case of performance management of preschools in Sweden. Semiotica, 2020 (236), 405–425. https://doi.org/10.1515/sem-2020-0032

Lester, F. (2013). Thoughts about research on mathematical problem-solving instruction. The Mathematics Enthusiast, 10 (1–2), 245–278.

Lester, F. K., & Cai, J. (2016). Can mathematical problem solving be taught? Preliminary answers from 30 years of research. In P. Felmer, E. Pehkonen, & J. Kilpatrick (Eds.), Posing and solving mathematical problems: Advances and new perspectives (pp. 117–135). Springer International Publishing. https://doi.org/10.1007/978-3-319-28023-3_8

Lohse, G., Biolsi, K., Walker, N., & Rueter, H. (1994). A classification of visual representations. Communications of the ACM, 37 (12), 36–49. https://doi.org/10.1145/198366.198376

Makar, K. (2012). The pedagogy of mathematical inquiry. In R. Gillies (Ed.), Pedagogy: New developments in the learning sciences (pp. 371–397). Hauppauge, N.Y.: Nova Science Publishers.

Mason, J. (2016). When is a problem…? “When” is actually the problem! In P. Felmer, E. Pehkonen, & J. Kilpatrick (Eds.), Posing and solving mathematical problems. Advances and new perspectives (pp. 263–287). Switzerland: Springer. https://doi.org/10.1007/978-3-319-28023-3_16

Matty, A. N. (2016). A study on how inquiry based instruction impacts student achievement in mathematics at the high school level . ProQuest Dissertations Publishing. https://www.proquest.com/openview/da895b80797c90f9382f0c9a948f7f68/1?pq-origsite=gscholar&cbl=18750

McCormick, N. J., Clark, L. M., & Raines, J. M. (2015). Engaging students in critical thinking and problem-solving: A brief review of the literature. Journal of Studies in Education, 5 (4), 100–113. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.960.810&rep=rep1&type=pdf

McGowan, M. M., & Boscia, M. W. (2016). Opening more than just a bag: Unlocking the flowchart as an effective problem-solving tool. The Journal of Health Administration Education, 33 (1), 211–219.

Merriam, S. B., & Tisdell, E. J. (2015). Qualitative research: A guide to design and implementation (4th ed.). Newark: Wiley.

Moon, J. (2008). Critical thinking: An exploration of theory and practice . Routledge. https://doi.org/10.4324/9780203944882

National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics . Reston, VA: National Council of Teachers of Mathematics.

Nieuwoudt, S. (2015). Developing a model for problem-solving in a Grade 4 mathematics classroom. Pythagoras, 36 (2), 1–7. https://doi.org/10.4102/pythagoras.v36i2.275

Norton, S. J., McRobbie, C. J., & Ginns, I. S. (2007). Problem-solving in a middle school robotics design classroom. Research in Science Education, 37 (3), 261–277. https://doi.org/10.1007/s11165-006-9025-6

Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods, 16 (1), 1609406917733847. https://doi.org/10.1177/1609406917733847

Ouyang, F., Chen, S., Yang, Y., & Chen, Y. (2022). Examining the effects of three group-level metacognitive scaffoldings on in-service teachers’ knowledge building. Journal of Educational Computing Research, 60 (2), 352–379. https://doi.org/10.1177/07356331211030847

Owens, K. D., & Clements, M. A. (1998). Representations in spatial problem-solving in the classroom. The Journal of Mathematical Behavior, 17 (2), 197–218. https://doi.org/10.1016/S0364-0213(99)80059-7

Ozuem, W., Willis, M., & Howell, K. (2022). Thematic analysis without paradox: Sensemaking and context. Qualitative Market Research, 25 (1), 143–157. https://doi.org/10.1108/QMR-07-2021-0092

Pape, S. J., & Tchoshanov, M. A. (2001). The role of representation(s) in developing mathematical understanding. Theory into Practice, 40 (2), 118–127. https://doi.org/10.1207/s15430421tip4002_6

Parvaneh, H., & Duncan, G. J. (2021). The role of robotics in the development of creativity, critical thinking and algorithmic thinking. Australian Primary Mathematics Classroom, 26 (3), 9–13. https://doi.org/10.3316/informit.448545849534966

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Sage Publications.

Polya, G. (1945). How to solve it: A new aspect of mathematical method . Princeton, NJ: Princeton University Press.

Polya, G. (1957). How to solve it: A new aspect of mathematical method . Princeton: Princeton University Press.

Polya, G. (1971). How to solve it: A new aspect of mathematical method (2nd ed.). Princeton University Press.

Queensland Curriculum and Assessment Authority (QCAA). (2018). Mathematical Methods: General senior syllabus . Brisbane: Queensland Curriculum and Assessment Authority. https://www.qcaa.qld.edu.au/downloads/senior-qce/syllabuses/snr_maths_methods_19_syll.pdf

Raiyn, J. (2016). The role of visual learning in improving students’ high-order thinking skills. Journal of Education and Practice, 7 , 115–121. https://www.learntechlib.org/p/195092/

Reingewertz, Y. (2013). Teaching macroeconomics through flowcharts. International Review of Economics Education, 14 , 86–93. https://doi.org/10.1016/j.iree.2013.10.004

Reinholz, D. L. (2020). Five practices for supporting inquiry in analysis. Problems Resources and Issues in Mathematics Undergraduate Studies, 30 (1), 19–35. https://doi.org/10.1080/10511970.2018.1500955

Roam, D. (2009). The back of the napkin: Solving problems and selling ideas with pictures (1st ed.). Singapore: Marshall Cavendish International (Asia) Private Limited.

Rogoff, B., Malkin, C., & Gilbride, K. (1984). Interaction with babies as guidance in development. New Directions for Child and Adolescent Development, 1984 (23), 31–44. https://doi.org/10.1002/cd.23219842305

Roth, W. M., & McGinn, M. (1998). Inscriptions: Toward a theory of representing as social practice. Review of Educational Research, 68 (1), 35–59.

Sakshaug, L. E., & Wohlhuter, K. A. (2010). Journey toward teaching mathematics through problem-solving. School Science and Mathematics, 110 (8), 397–409. https://doi.org/10.1111/j.1949-8594.2010.00051.x

Santoso, B., & Syarifuddin, H. (2020). Validity of mathematic learning teaching administration on realistic mathematics education based approach to improve problem-solving. Journal of Physics. Conference Series, 1554 (1), 12001. https://doi.org/10.1088/1742-6596/1554/1/012001

Schoenfeld, A. H. (1980). Teaching problem-solving skills. The American Mathematical Monthly, 87 (10), 794. https://doi.org/10.2307/2320787

Schoenfeld, A. H. (1983). Problem solving in the mathematics curriculum . The Mathematical Association of America.

Schoenfeld, A. H. (2013). Reflections on problem-solving theory and practice. The Mathematics Enthusiast, 10 (1/2), 9.

Schoenfeld, A. H. (2016). Learning to think mathematically: Problem-solving, metacognition, and sense making in mathematics (Reprint). Journal of Education, 196 (2), 1–38. https://doi.org/10.1177/002205741619600202

Schoenfeld, A. H., Floden, R. E., & The algebra teaching study and mathematics assessment project. (2014). An introduction to the TRU Math document suite . Berkeley, CA & E. Lansing, MI: Graduate School of Education, University of California, Berkeley & College of Education, Michigan State University. Retrieved from: http://ats.berkeley.edu/tools.html

Schreier, M. (2012). Qualitative content analysis in practice . London: SAGE.

Snyder, H. (2019). Literature review as a research methodology: An overview and guidelines. Journal of Business Research, 104 , 333–339. https://doi.org/10.1016/j.jbusres.2019.07.039

Stone, C. A. (1998). Should we salvage the scaffolding metaphor? Journal of Learning Disabilities, 31 (4), 409–413. https://doi.org/10.1177/002221949803100411

Stylianou, D. A. (2010). Teachers’ conceptions of representation in middle school mathematics. Journal of Mathematics Teacher Education, 13 (4), 325–343. https://doi.org/10.1007/s10857-010-9143-y

Sweller, J., Van Merrienboer, J. J. G., & Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. Educational Psychology Review, 31 (2), 261–292. https://doi.org/10.1007/s10648-019-09465-5

Sztajn, P., Borko, H., & Smith, T. (2017). Research on mathematics professional development. In J. Cai (Ed.), Compendium for research in mathematics education (Chapter 29, pp. 213–243). Reston, VA: National Council of Teachers of Mathematics.

Tobin, G. A., & Begley, C. M. (2004). Methodological rigor within a qualitative framework. Journal of Advanced Nursing, 48 , 388–396. https://doi.org/10.1111/j.1365-2648.2004.03207.x

Torraco, R. J. (2005). Writing integrative literature reviews: Guidelines and examples. Human Resource Development Review, 4 (3), 356–367. https://doi.org/10.1177/1534484305278283

Trouche, L., Gueudet, G., & Pepin, B. (2018). Documentational approach to didactics. In S. Lerman (Ed.), Encyclopedia of mathematics education. Cham: Springer. https://doi.org/10.1007/978-3-319-77487-9_100011-1

Trouche, L., Rocha, K., Gueudet, G., & Pepin, B. (2020). Transition to digital resources as a critical process in teachers’ trajectories: The case of Anna’s documentation work. ZDM Mathematics Education, 52 (7), 1243–1257. https://doi.org/10.1007/s11858-020-01164-8

Vale, I., & Barbosa, A. (2018). Mathematical problems: The advantages of visual strategies. Journal of the European Teacher Education Network, 13 , 23–33.

Vale, I., Pimentel, T., & Barbosa, A. (2018). The power of seeing in problem solving and creativity: An issue under discussion. In S. Carreira, N. Amado, & K. Jones (Eds.), Broadening the scope of research on mathematical problem-solving: A focus on technology, creativity and affect (pp. 243–272). Switzerland: Springer.

van Garderen, D., Scheuermann, A., Sadler, K., Hopkins, S., & Hirt, S. M. (2021). Preparing pre-service teachers to use visual representations as strategy to solve mathematics problems: What did they learn? Teacher Education and Special Education, 44 (4), 319–339. https://doi.org/10.1177/0888406421996070

Voskoglou, M. (2021). Problem solving and mathematical modelling. American Journal of Educational Research, 9 (2), 85–90. https://doi.org/10.12691/education-9-2-6

Vygotsky, L. S. (1978). Mind in society . Cambridge, MA: Harvard University Press.

Zahner, D., & Corter, J. E. (2010). The process of probability problem solving: Use of external visual representations. Mathematical Thinking and Learning, 12 (2), 177–204. https://doi.org/10.1080/10986061003654240

Zhang, J. (1997). The nature of external representations in problem solving. Cognitive Science, 21 (2), 179–217. https://doi.org/10.1207/s15516709cog2102_3

Acknowledgements

We would like to acknowledge and thank the teachers and students involved in the research.

Open Access funding enabled and organized by CAUL and its Member Institutions. The study was financially supported by a James Cook University higher degree by research grant.

Author information

Authors and affiliations.

College of Arts, Society and Education, James Cook University, Cairns, Australia

Musarurwa David Chinofunga, Philemon Chigeza & Subhashni Taylor

Contributions

David: developed original idea, completed literature review, data analysis and authored the first draft of the article (80%). Philemon and Subhashni contributed to the data analysis, coherence of ideas and editing of the article (10% each).

Corresponding author

Correspondence to Musarurwa David Chinofunga .

Ethics declarations

Ethics approval.

This research was approved by the Human Research Ethics Committee, James Cook University (Approval Number H8201).

Informed consent

Informed consent was sought from all participants in accordance with the above ethics approval conditions. Before seeking consent from participants, permission was first obtained from the Queensland Department of Education and the school principals of participating schools. Information sheets and consent forms were sent to participating mathematics teachers. An information sheet and consent form for students to share their artefacts were sent through their teacher, and consent was obtained directly from the students as young adults (17–18 years). The consent forms and the data collected have been de-identified to protect participants and stored in the university repository.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is a research report forming part of a PhD study by the first author, an experienced high school mathematics teacher in Queensland, Australia. The second and third authors are the primary and secondary advisors, respectively. Correspondence concerning this article should be addressed to David Chinofunga, College of Arts, Society and Education, Nguma-bada Campus, Smithfield, Building A4, Cairns, PO Box 6811 Cairns QLD 4870, Australia.

Appendix 1 An approach to problem solving and mathematical modelling

figure a

Appendix 2 Phases three and four thematic analysis themes

figure b

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Chinofunga, M.D., Chigeza, P. & Taylor, S. How can procedural flowcharts support the development of mathematics problem-solving skills? Math Ed Res J (2024). https://doi.org/10.1007/s13394-024-00483-3

Download citation

Received : 09 February 2023

Revised : 28 November 2023

Accepted : 06 January 2024

Published : 22 February 2024

DOI : https://doi.org/10.1007/s13394-024-00483-3


  • Problem solving
  • Procedural flowcharts
  • Problem-solving stages
  • Solution planning
  • Visual representation



Title: Chain of Logic: Rule-Based Reasoning with Large Language Models

Abstract: Rule-based reasoning, a fundamental type of legal reasoning, enables us to draw conclusions by accurately applying a rule to a set of facts. We explore causal language models as rule-based reasoners, specifically with respect to compositional rules - rules consisting of multiple elements which form a complex logical expression. Reasoning about compositional rules is challenging because it requires multiple reasoning steps, and attending to the logical relationships between elements. We introduce a new prompting method, Chain of Logic, which elicits rule-based reasoning through decomposition (solving elements as independent threads of logic), and recomposition (recombining these sub-answers to resolve the underlying logical expression). This method was inspired by the IRAC (Issue, Rule, Application, Conclusion) framework, a sequential reasoning approach used by lawyers. We evaluate chain of logic across eight rule-based reasoning tasks involving three distinct compositional rules from the LegalBench benchmark and demonstrate it consistently outperforms other prompting methods, including chain of thought and self-ask, using open-source and commercial language models.
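The decompose-recompose idea can be illustrated with a minimal sketch. This is not the authors' implementation: the compositional rule ((A and B) or C), the element names, and the hard-coded sub-answers are hypothetical stand-ins for the per-element prompts the method would issue to a language model.

```python
# Minimal sketch of Chain of Logic's decomposition/recomposition step.
# Hypothetical rule and sub-answers; not the paper's implementation.

def recompose(element_answers, expression):
    """Recombine per-element sub-answers via the rule's logical expression."""
    return expression(element_answers)

# Hypothetical compositional rule: liability requires (A and B) or C.
# Decomposition: each element is resolved as an independent thread of
# logic; here booleans stand in for the per-element model answers.
sub_answers = {
    "A": True,   # element A satisfied by the facts
    "B": False,  # element B not satisfied
    "C": True,   # element C satisfied
}

# Recomposition: resolve the underlying logical expression.
conclusion = recompose(sub_answers, lambda e: (e["A"] and e["B"]) or e["C"])
print(conclusion)  # True: element C alone satisfies the rule
```

Because the sub-answers are resolved independently before recomposition, a wrong intermediate step is localised to one element rather than derailing the whole chain, which is the advantage the paper claims over monolithic chain-of-thought prompting.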


The Economic Times

These skills can help you save your job

“I think, therefore I am” is one of the most profound statements by the mathematician and philosopher Descartes. It speaks of finding truth amid doubt and uncertainty, a skill that is becoming ever more valuable today.

From interviews to job talks, you will often be asked to demonstrate two skills that help you navigate through uncertainty towards the truth: critical thinking and problem solving.

Why are these skills needed today?

Given the rapid advances in technology and the way the future of work is unfolding, there is markedly more uncertainty today.

In fact, children in school today may grow up to work in jobs we would not even recognise now. Existing jobs are also shape-shifting, with new roles emerging as we adapt to working with machines.

Hence, instead of just preparing for short-term need-based jobs, inculcating the skills of critical thinking and problem solving can stand a person in good stead for newer challenges we might face.

What does critical thinking and problem solving look like in action?

Imagine sifting through vast amounts of information, synthesising it, and drawing logical, evidence-based conclusions. That is the essence of critical thinking.

Problem solving then carries this further: it helps us generate possible answers to a problem and work towards the intended solution.

Logic plays a key role in critical thinking. Daniel Kahneman, in his seminal book “Thinking, Fast and Slow”, described two kinds of thinking that we as humans do: immediate, gut-based thinking that is often intuitive; and deep, deliberate thinking.

Both kinds of thinking are required to make different kinds of decisions and to attack different kinds of problems that we will face in our work life.

From a logical point of view, there are two ways to approach this: inductive logic and deductive logic.

In an inductive reasoning approach, we start from individual data points, look for patterns across them, and generalise from those patterns to a conclusion.

In a deductive, hypothesis-driven approach, we start from a possible hypothesis about the problem we are addressing. This hypothesis could be the product of our intuitive system. From it, we can use data in a more targeted way to either support or refute the hypothesis.

At every step, it is important to be aware of the possibility of bias creeping in.

Let’s look at a couple of real-life situations.

Say the customer satisfaction numbers for a company are falling over time. How can you improve that situation? Such a problem might require both critical thinking and problem solving.

Using inductive logic, you might start by examining multiple data points across customer touchpoints to identify the key causes for concern.

Using deductive logic, on the other hand, you might first form a hypothesis, such as “this is due to customer service levels dropping in channel x”, and then examine the data to see whether it holds.

While both are valid approaches, the second can save time in an urgent business situation.
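The two routes (scanning all the data for a pattern, versus testing a single targeted hypothesis against it) can be sketched with a toy dataset; the channels and scores below are invented purely for illustration.

```python
# Toy customer-satisfaction records, invented for illustration only.
records = [
    {"channel": "phone", "score": 4.1},
    {"channel": "email", "score": 4.3},
    {"channel": "chat",  "score": 2.2},
    {"channel": "chat",  "score": 2.5},
]

# Pattern-from-data route: aggregate every data point per channel and
# let the weakest channel emerge from the averages.
by_channel = {}
for r in records:
    by_channel.setdefault(r["channel"], []).append(r["score"])
averages = {c: sum(s) / len(s) for c, s in by_channel.items()}
worst = min(averages, key=averages.get)

# Hypothesis-driven route: test one targeted claim ("chat service
# levels have dropped below an acceptable threshold") against the data.
hypothesis_holds = averages.get("chat", 5.0) < 3.0

print(worst)             # chat
print(hypothesis_holds)  # True
```

On a dataset of four rows the difference is trivial, but across millions of records the hypothesis-driven route only has to examine the slice of data relevant to the claim, which is why it can be faster in an urgent business situation.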

Another example, of a kind often posed in interviews: imagine you are asked, “How do you estimate the market demand for petrol pumps in the city?”

Now that you know the two approaches, you can apply similar logic to arrive at possible answers. The interviewer is looking at your thinking process, not at the exact answer.

There are tools such as structured thinking that take us through a step-by-step approach to focus on insights and problem solving. And reading is another way in which we can keep building our critical thinking skills.

This is also the reason why aptitude in reading, writing, mathematics and logical reasoning is tested in many competitive examinations.

The difference is that the need for these skills does not end with clearing the exams; they must be honed lifelong.

One of Coursera’s most popular courses is “Learning How to Learn”. That constant learnability may be our best guard against uncertainty.

For more news like this visit The Economic Times .



COMMENTS

  1. The Most Important Logical Thinking Skills (With Examples)

    Logical thinking is problem solving based on reasoning that follows a strictly structured progression of analysis. Critical thinking, research, creativity, mathematics, reading, active listening, and organization are all important logical thinking skills in the workplace.

  2. 7 Module 7: Thinking, Reasoning, and Problem-Solving

    Characteristics of critical thinking: skepticism; identify biases, distortions, omissions, and assumptions; reasoning and problem solving skills (7.1)

  3. Logical Reasoning Skills for Effective Problem Solving

    1. What are logical reasoning skills? 2. How to improve your logical reasoning skills? 3. How to ...

  4. How To Improve Your Logical Reasoning Skills (Plus Types)

    1. Practice conditional statements. Conditional statements are one of the bases of logical reasoning. A conditional statement is a verifiable truth that's dependent on another variable or condition. For example, you might note that if your colleague responds to your email immediately, that means they're online.

  5. Problem Solving, Critical Thinking, and Analytical Reasoning Skills

    "Mentions of critical thinking in job postings have doubled since 2009, according to an analysis by career-search site Indeed.com." Making logical and reasoned judgments that are well thought out is at the core of critical thinking. Using critical thinking, an individual will not automatically accept information or conclusions drawn from it as factual, valid, true ...

  6. What Are Critical Thinking Skills and Why Are They Important?

    It makes you a well-rounded individual, one who has looked at all of their options and possible solutions before making a choice. According to the University of the People in California, having critical thinking skills is important because they are [ 1 ]: Universal. Crucial for the economy. Essential for improving language and presentation skills.

  7. Logical Reasoning

    Logical problem-solving strategies can be used to solve these problems. What are the types of logical reasoning? There are many different types of logical reasoning. Some basic types...

  8. Logical Reasoning Test: 100s Of Free Practice Questions (2024)

    Logical reasoning tests are a type of psychometric test used to measure your problem-solving skills. They come in various forms, but all have the underlying purpose of assessing your logical aptitude and your ability to draw conclusions from a given set of information.

  9. Critical Thinking

    Critical thinking involves rigorously and skilfully using information, experience, observation, and reasoning to guide your decisions, actions and beliefs. It's a useful skill in the workplace and in life. You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when ...

  10. Getting started with Logical Reasoning (article)

    Anatomy of a Logical Reasoning question. A Logical Reasoning question is made up of these parts: Passage/stimulus: This text is where we'll find the argument or the information that forms the basis for answering the question. Sometimes there will be two arguments, if two people are presented as speakers.

  11. The development of the reasoning brain and how to foster logical

    This suggests that logical reasoning is hard even for educated adults, a conclusion that is supported by a wealth of psychological studies. Perhaps the most striking demonstration of the difficulty of logical reasoning was discovered by the psychologist Peter Wason in 1966. Wason designed a task in which he presented participants with four ...

  12. The Best Ways To Strengthen Your Logical Thinking Skills

    1. Spend time on creative hobbies. Creative outlets like drawing, painting, writing and playing music can stimulate the brain and help promote logical thinking. Creative thinking naturally develops problem-solving abilities that can help you become a better performer at work.

  13. What Is Logical Thinking

    According to a global report, problem-solving, a critical and logical thinking aspect, is one of the top skills employers look for in job candidates. So, it explains the demand for logical thinking or reasoning abilities. ... The best way to define logical reasoning skills is the ability to focus on tasks and activities by following a chain of ...

  14. Does mathematics training lead to better logical thinking and reasoning

    Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically or to have improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational ...

  15. How logical reasoning works

    Logical reasoning is the process of using rational and systematic series of steps to come to a conclusion for a given statement. The situations that ask for logical reasoning require structure, a relationship between given facts and chains of reasoning that are sensible.

  16. What Is Logical Thinking? 8 Tips to Improve Logic

    3. Creative Hobbies. Although the left hemisphere of our brain is responsible for logical thinking, creative activities, which are mainly ruled by the right hemisphere of our brain, help promote logical thinking. Therefore, encourage your children to engage with creative activities such as playing imagination games, drawing, painting, writing ...

  18. Critical thinking

    Critical thinking, in educational theory, is a mode of cognition using deliberative reasoning and impartial scrutiny of information to arrive at a possible solution to a problem. From the perspective of educators, critical thinking encompasses both a set of logical skills that can be taught and a disposition toward ...

  19. Mathematics Improves Your Critical Thinking and Problem-Solving

    Mathematics provides a systematic and logical framework for problem-solving and critical thinking. The study of math helps to develop analytical skills, logical reasoning, and problem-solving abilities that can be applied to many areas of life.

  20. Logical Reasoning Questions and Answers

    Reasoning questions allow organizations to assess a candidate's problem-solving skills, critical thinking capabilities, and capacity for logical and analytical thinking. Aptitude Questions such as Quantitative Aptitude and Logical Reasoning are considered essential skills for success in a wide range of competitive exams worldwide.

  21. How to assess reasoning skills

    Quality decision-making: reasoning skills contribute to effective decision-making. Employees who think critically and can logically evaluate information are more likely to make informed decisions based on evidence and careful analysis. Their ability to weigh options, consider potential outcomes, and anticipate risks helps mitigate errors.

  22. PDF ANALYTICAL THINKING AND PROBLEM-SOLVING

    you must use logic and reason to find the true causes of the problems. Let us look at an example and try to use logic and reason to find a cause. After you gathered enough information and have a basic understanding of the problem, there are some more questions that you should ask yourself to understand the problem and possible solutions even more.

  23. Using NRICH Tasks to Develop Key Problem-solving Skills

    Pattern spotting. Working backwards. Reasoning logically. Visualising. Conjecturing. The first two in this list are perhaps particularly helpful. As learners progress towards a solution, they may take the mathematics further (stage 3) and two more problem-solving skills become important: Generalising. Proving.

  24. Analytical Reasoning Tools and Software for Data Analysis

    Analytical reasoning is the ability to use logic, data, and evidence to solve problems and make decisions. It is a crucial skill for many fields and professions, especially in the era of big data ...

  25. How can procedural flowcharts support the development of ...

    Supporting students' problem-solving skills, solution planning and sequencing of different stages that are involved in successfully developing a meaningful solution to a problem has been a challenge for teachers. ... More importantly, they can be used to represent the logical progression of ideas and reasoning during problem solving (Roam ...

  26. Chain of Logic: Rule-Based Reasoning with Large Language Models

    Reasoning about compositional rules is challenging because it requires multiple reasoning steps, and attending to the logical relationships between elements. We introduce a new prompting method, Chain of Logic, which elicits rule-based reasoning through decomposition (solving elements as independent threads of logic), and recomposition ...

  27. These skills can help you save your job

    The skills of critical thinking and problem solving are increasingly valuable as technology advances and the future of work remains uncertain. In addition to short-term job preparation, these ...
