The Biology Corner

Genetics Practice Problems – Easy Mode

This worksheet was created for an introductory-level biology class. Many of these students had not been successful in a regular-track class, so I wanted to take a very slow approach to genetics. Punnett squares can frustrate many students when they are just learning, so this worksheet is designed to move through difficulty levels, and students start with “easy mode.”

Easy mode has the Punnett squares already set up; students just need to fill in the boxes and count how many of each type of guinea pig is expected. Even when I model this procedure, some of my students still have trouble figuring out how to pull letters down and across.

Once students have mastered this skill, they move on to “normal mode,” where they must set up the squares themselves.

For many students, this is a challenge; they seem to struggle with where to put each parent. Other students get it right away, which leads to classroom management issues: while you are trying to help the students who are struggling, the ones who have finished are bored.

Those students are given the “hard mode” page which is on Scottish Fold cats and requires them to figure out genotypes, set up squares, and make predictions about kittens. 

Students who are working on the hard mode page are told they will not receive help right away, and that’s the point of hard mode: they need to do it on their own. (This affords me time to work with the students who are struggling with the other pages.)

There are several genetics practice problems in my library, such as Simple Genetics Practice Problems, which you can also assign to those students who are forging ahead. I do not do dihybrid crosses or sex-linked crosses with this class.

Grade Level: 7-12. Time Required: Variable, depending on skill level (10 to 30 minutes).

Shannan Muskopf

The Learning Hypothesis

Simple Genetics Practice Problems that Don’t Use a Family Tree

I have a love/hate relationship with genetics. It is the natural follow-up to fun mitosis and meiosis activities. It is interesting, and kids seem to want to dig in immediately, but it is usually taught incorrectly using human traits AND the family tree. Why are those two things troublesome? I’m glad you asked.

Human traits do not follow a simple inheritance pattern.

Human traits tend to follow complex inheritance patterns. This is especially true for traits like hair color and eye color, which are polygenic (several genes contribute to them).

The entire premise of simple brown-versus-blue eye inheritance is wrong, but it is still one of the go-to examples when it comes to inheritance. If that were the pattern, there would only be brown or blue eyes, and they would all be the same shade. I don’t like using human traits incorrectly, because it isn’t good science.

That isn’t the only reason I think that teachers should be worried about using family trees in a group setting.

Family tree assignments are NOT inclusive.

Some kids don’t have the information to complete these assignments. Family life is complicated, and kids should not be asked to parade their story out to their peers. I know more than a few teachers who have had uncomfortable situations pop up as a result of family tree assignments. I’ve also heard of several cases of unknown relatives being discovered through consumer DNA services, so these assignments can feel invasive.

There are lots of ways that families are made, and continuing with assignments that don’t honor that is hurtful. Plus, as I just mentioned, it isn’t good science.

There are better ways to talk about genetics than using family trees. If you need an autosomal recessive pedigree (or any other kind), use animals or pedigrees from royal families. If there is a specific reason you want to review a human trait, but a royal pedigree doesn’t fit, consider making up a family tree for use by the entire class.

We are moving into an age where genetic information matters more and more, so it is important to be aware of privacy concerns when dealing with these types of assignments.

Mendel’s experiments.

Mendel used pea plants in his experiments. For the love of Pete, if you can’t think of anything else, use pea plants instead of the family tree assignment. You can track visible traits like flower color and see what percentage of the offspring have white flowers compared to purple flowers.
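If you want a quick way to check the math (or generate an answer key), here is a tiny R sketch of that cross. The allele letters are just placeholders I made up, with P for purple dominant over p for white:

```r
gametes1 <- c("P", "p")                       # gametes from one Pp parent
gametes2 <- c("P", "p")                       # gametes from the other Pp parent
square <- outer(gametes1, gametes2, paste0)   # the four boxes of the Punnett square
square                                        # PP, Pp, pP, pp
phenotype <- ifelse(grepl("P", square), "purple", "white")
prop.table(table(phenotype))                  # purple 0.75, white 0.25
```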

You can also make it as fun as you want by having students pick traits and the inheritance pattern for a made up species.  Start with monohybrid crosses (only a single gene) and a simple dominant trait and recessive trait.

All kids need is a little imagination, some guidance, and a Punnett Square to have lots of fun.  You can even have them create an alien race on a fictional planet and include ELA standards.  Kids love to make brochures and presentations.  You might want to start with some basics first.

Mendelian Genetics Activities

I like to focus on dominant/recessive patterns of inheritance first, starting with one-trait (monohybrid) crosses and moving to two-trait (dihybrid) crosses. Mendel bred pea plants and determined dominant and recessive traits based on the ratios of offspring phenotypes. He also went on to discover other inheritance patterns.
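Here is a similar R sketch for a dihybrid cross, again with made-up alleles (R/r and Y/y standing in for Mendel’s seed shape and seed color) and assuming the two genes assort independently; it recovers the classic 9:3:3:1 phenotype ratio:

```r
g <- c("RY", "Ry", "rY", "ry")            # the four gamete types from an RrYy parent
offspring <- outer(g, g, paste0)          # the 16-box Punnett square
shape <- ifelse(grepl("R", offspring), "round", "wrinkled")
color <- ifelse(grepl("Y", offspring), "yellow", "green")
table(paste(shape, color))                # 9 round yellow : 3 : 3 : 1
```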

You can do worksheets and use games to explore simple genetics practice problems, but if possible, I prefer hands-on activities. I teach these concepts using my Easter Egg Genetics Lab.

In this lab, students explore monohybrid and dihybrid crosses of simple dominant/recessive patterns.  There is also an optional part 2 where students work with co-dominance and incomplete dominance patterns using a hands-on lab and genetics problems worksheet.  Students get a chance to see the difference between phenotype and genotype.   This does require setup in the beginning, but I’ve used the same eggs for 15 years now.

CBE—Life Sciences Education, Vol. 18, No. 2 (Summer 2019)

Problem Solving in Genetics: Content Hints Can Help

Jennifer S. Avena

Department of Molecular, Cellular, and Developmental Biology, University of Colorado–Boulder, Boulder, CO 80309

Jennifer K. Knight

Problem solving is an integral part of doing science, yet it is challenging for students in many disciplines to learn. We explored student success in solving genetics problems in several genetics content areas using sets of three consecutive questions for each content area. To promote improvement, we provided students the choice to take a content-focused prompt, termed a “content hint,” during either the second or third question within each content area. Overall, for students who answered the first question in a content area incorrectly, the content hints helped them solve additional content-matched problems. We also examined students’ descriptions of their problem solving and found that students who improved following a hint typically used the hint content to accurately solve a problem. Students who did not improve upon receipt of the content hint demonstrated a variety of content-specific errors and omissions. Overall, ultimate success in the practice assignment (on the final question of each topic) predicted success on content-matched final exam questions, regardless of initial practice performance or initial genetics knowledge. Our findings suggest that some struggling students may have deficits in specific genetics content knowledge, which when addressed, allow the students to successfully solve challenging genetics problems.

INTRODUCTION

Problem solving has been defined in the literature as engaging in a decision-making process leading to a goal, in which the course of thought needed to solve the problem is not certain ( Novick and Bassok, 2005 ; Bassok and Novick, 2012 ; National Research Council, 2012 ; Prevost and Lemons, 2016 ). Ample research shows that students have difficulty learning how to solve complex problems in many disciplines. For example, in biology and chemistry, students often omit critical information or recall information incorrectly and/or apply information incorrectly to a problem ( Smith and Good, 1984 ; Smith, 1988 ; Prevost and Lemons, 2016 ). Furthermore, across many disciplines, researchers have found that experts use different procedural processes than nonexperts when solving problems ( Chi et al. , 1981 ; Smith and Good, 1984 ; Smith et al. , 2013 ). While students often identify problems based on superficial features, such as the type of organism discussed in a problem, experts identify primary concepts and then link the concept with strategies on how to solve such a problem ( Chi et al. , 1981 ; Smith and Good, 1984 ; Smith et al. , 2013 ). Experts also often check their work and problem solutions more frequently than nonexperts ( Smith and Good, 1984 ; Smith, 1988 ). Given the difficulties students have in problem solving and the value of such skills to their future careers, there is clearly a need for undergraduate educators to assist students in developing problem-solving skills ( American Association for the Advancement of Science, 2011 ; National Research Council, 2012 ).

Two kinds of knowledge have been described in the literature as important for solving problems: domain specific and domain general. Domain-specific knowledge is knowledge about a specific field, including the content (declarative knowledge), the procedural processes used to solve problems (procedural knowledge), and how to apply content and process when solving problems (conditional knowledge; Alexander and Judy, 1988 ). Domain-general knowledge is knowledge that can be used across many contexts ( Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ). A third category, strategic knowledge, is defined as knowledge about problem-solving strategies that can be domain specific or domain general ( Chi, 1981 ; Alexander and Judy, 1988 ). Research suggests that domain-specific knowledge is needed, but may not be sufficient, for applying strategic knowledge to solve problems ( Alexander and Judy, 1988 ; Alexander et al. , 1989 ). Thus, helping students learn to solve problems likely requires teaching them how to activate their content knowledge, apply their knowledge to a problem, and logically think through the problem-solving procedure.

Previous research suggests that receiving help in a variety of forms, including procedure-based prompts ( Mevarech and Amrany, 2008 ), a combination of multiple content- and procedure-based prompts ( Pol et al. , 2008 ), and models ( Stull et al. , 2012 ), can be beneficial to learning.   Not surprisingly, accessing relevant prior knowledge has been shown to positively influence performance ( Dooling and Lachman, 1971 ; Bransford and Johnson, 1972 ; Gick and Holyoak, 1980 ). For example, in genetics, successful problem solvers often identify similarities between problems, whereas unsuccessful problem solvers do not ( Smith, 1988 ). Previous research also suggests that receiving procedural guidance can be beneficial to learning. In a study that asked students to examine different problems with related solutions, prompting students to consider previously reviewed problems helped most students subsequently solve a challenging problem ( Gick and Holyoak, 1980 ). In another study, when students received guidance that included identifying similarities to other problems as well as other procedural skills, such as planning and checking their work, they were better able to solve subsequent problems than in the absence of such guidance ( Mevarech and Amrany, 2008 ). However, although accessing prior knowledge is important, it is also important that students understand how to apply their prior knowledge to a given problem ( Bransford and Johnson, 1972 ). Thus, while students may realize they need additional information to solve a problem, if they cannot make sense of this information in the context of a given problem, the information is unlikely to be useful.

In addition to knowledge, students need practice. Within the field of psychology, many studies have examined the association between practice and performance. Completing a practice test leads to better performance on a subsequent final test compared with other conditions in which students do not test themselves, such as studying or completing an unrelated or no activity (e.g., Roediger and Karpicke, 2006 ; Adesope et al. , 2017 ). In a meta-analysis, this effect, termed the “testing effect,” was found to occur regardless of whether feedback was given and regardless of the time between the practice test and the final test ( Adesope et al. , 2017 ). The benefits of practice testing on later performance can occur not only when using the same questions (retention) but also when students are asked to transfer information to nonidentical questions, including questions that require application of concepts. In one of the few studies on the testing effect using transfer questions, students who took practice tests performed better on transfer questions on a final test for both factual (i.e., a single fact in a sentence) and conceptual (i.e., a cohesive idea across multiple sentences) questions than those who studied but did not take practice tests ( Butler, 2010 ). This study also found that those who performed well on their practice tests were more likely to do well than those who performed poorly on their practice tests 1 week after practice on a subsequent final test, which included conceptual questions that required application ( Butler, 2010 ).

In the current study, we focused on whether students who are incorrectly solving a problem can apply content knowledge given to them as a prompt to correctly solve subsequent genetics problems. We address the following questions: 1) Does providing a single content-focused prompt help students answer similar questions during subsequent practice, and does this practice help on later exams? 2) When unable to apply content prompts, what content errors and omissions do students make that lead them to continue to answer incorrectly?

Participants

We invited students enrolled in an introductory-level undergraduate genetics course for biology majors (total of 416 students in the course) at a 4-year institution during Spring 2017 to complete each of two practice assignments containing content related to course exams. The first practice assignment was taken immediately before a unit exam, and the second assignment was taken either immediately before the next unit exam or after this exam in preparation for the cumulative final exam (see Supplemental Figure S1 for timeline). Each assignment was offered online (using the survey platform Qualtrics) for up to 6 points of extra credit (650 total course points). Students received 4 points for answering the question with an explanation of their problem-solving process and an additional 2 points if they answered correctly. The practice assignments were announced in class and by email, with encouragement to complete the assignment as preparation for an upcoming exam. Students had the option to consent to have their answers used for research purposes, and all students who completed the assignment received credit regardless of their consent.

Course Performance Metrics

Students in the course were given the option to complete the Genetics Concept Assessment (GCA; Smith et al. , 2008 ) online at the beginning of the semester (within the first week of classes) for participation extra credit. The 25 GCA questions address eight of the 11 learning objectives taught in this course. Initial performance on the GCA is reported as the pretest. Students answered the same GCA questions again on the cumulative final exam, for credit, along with instructor-generated questions that also addressed the content from practice assignments along with other course content. The instructor-generated questions on the final exam comprised 15% of the student’s final course grade, and the GCA questions comprised just under 8% of the student’s final course grade.

Practice Assignment Content

We selected content areas known to be challenging for genetics students ( Smith et al. , 2008 ; Smith and Knight, 2012 ) and developed sets of questions on the following five topics: calculation of the probability of inheritance across multiple generations (“probability”), prediction of the cause of an incorrect chromosome number after meiosis (“nondisjunction”), interpretation of a gel and pedigree to determine inheritance patterns (“gel/pedigree”), prediction of the probability of an offspring’s genotype using linked genes (“recombination”), and determination of the parental germ line from which a gene is imprinted (“imprinting”).

For each content area, we wrote three questions intended to be isomorphic that had the following characteristics: they addressed the same underlying concept but used different superficial characteristics, targeted higher-order cognitive processes as assessed by Bloom’s level ( Bloom et al. , 1956 ), contained the same amount of information, and required students to perform similar processes to solve the problem. The questions were in constructed-response format but had a single correct answer, and each question also had a coinciding visual aid (example in Figure 1 ; see all questions in the Supplemental Material). The questions were initially based on previously used exam questions in the course and were tested and modified through individual think-aloud interviews (16 students and seven genetics faculty) and/or a focus group (three students).

Figure 1. Example of a practice question used for problem solving on the content of nondisjunction. Each question in the study had a visual aid, was constructed response, and had a single correct answer.

The three questions within a given content area (referred to as a “trio”) were given sequentially in the practice assignments, with the first, second, and third questions referred to as “Q1,” “Q2,” and “Q3,” respectively. For each problem-solving assignment, we randomized for each student the order of the three questions within each content area and the order in which each content area was presented. In the first problem-solving assignment, to prevent fatigue, students answered two of three randomly assigned content areas (probability, nondisjunction, and gel/pedigree), and for the second assignment, students completed questions on both recombination and imprinting.
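For illustration only, the following is a minimal R sketch of this kind of per-student randomization; the labels and data structure are invented for the example and are not the authors’ Qualtrics implementation.

```r
set.seed(1)                                        # for a reproducible example
areas <- c("probability", "nondisjunction", "gel_pedigree")
assign_practice1 <- function(student_id) {
  chosen <- sample(areas, 2)                       # two randomly assigned content areas
  questions <- lapply(setNames(chosen, chosen),
                      function(a) sample(paste0(a, "_Q", 1:3)))  # shuffled trio order
  list(student = student_id, questions = questions)
}
assign_practice1("student_001")
```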

Experimental Conditions

We developed content-focused prompts (referred to hereafter as “content hints”) based on common student errors revealed during in-class questions and previous exams for this course and/or during individual student think-aloud interviews. Each hint addressed the most common student error and contained only a single content idea ( Table 1 ). In each online practice assignment, we randomly assigned students to one of two conditions: an optional content hint when taking the second question of a content trio (hint at Q2) or an optional content hint when taking the third question of a content trio (hint at Q3). The first question (Q1) served as a baseline measure of performance for all students. At Q2, we compared the performance of students in the two conditions to determine the effect of a hint versus practice only. At Q3, we compared the performance of students within each condition with their performance on Q2 to determine whether performance was maintained (for hint at Q2 condition) or the hint improved performance compared with Q2 (for hint at Q3 condition). Using this randomized design, we could examine the differential effect of practice versus a hint, while still giving all students a chance to receive hints.

Table 1. Content hints.

Either at Q2 or at Q3 (depending on the condition), students were asked to respond to the following question: “Do you want a hint to solve this problem (No penalty)? If so, click here.” If they clicked, the hint appeared immediately below the problem, so students could see the hint while solving the problem. By asking students to select the hint rather than just showing it to everyone, we could track who chose to take a hint and thus distinguish between hint takers and non–hint takers. We did not provide real-time feedback to students, because the provided hints were intended to serve as a scaffolding mechanism without individual feedback. In addition, it would have been challenging to provide feedback, because the online platform used did not allow for personalized feedback and because the student answers were constructed response and could not be automatically graded.

Problem-Solving Content and Errors

We instructed students to explain in writing their thinking and the steps they were taking to solve the problem before they provided the final answer to each question ( Prevost and Lemons, 2016 ). Students were not allowed to return to a question once answered. The instructions at the beginning of the assignment outlined an example of how to do this (see the Supplemental Material), and students were able to reread the instructions and an example, if desired, during the assignment. In this study, we only tracked student performance and their use of language regarding the content hint, not their thinking or problem-solving steps.

We categorized student content-specific errors and omissions and also the use of language related to the content hint. The two authors reviewed a selection of student answers to develop an initial set of codes. We then independently coded, over three iterations, the same 66 of 456 selected answers. After each iteration, we discussed our codes to come to a consensus and revised the coding scheme as needed to represent student answers. We coded an additional 19 answers to reach a final interrater agreement of 85% (Cohen’s kappa of 0.83). Because we had coded and agreed upon 19% of the student answers at this point and our agreement was above acceptable levels ( Landis and Koch, 1977 ), we then each coded half of the remaining 371 answers independently and discussed and resolved any concerns.
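As a rough illustration of this kind of agreement check, the following R sketch uses the irr package named in the Statistical Analysis section; the rating vectors here are invented, not the study’s actual codes.

```r
library(irr)                              # package listed by the authors for statistics
rater1 <- c("hint", "no hint", "hint", "hint", "no hint", "hint")   # toy codes, rater 1
rater2 <- c("hint", "no hint", "hint", "no hint", "no hint", "hint")# toy codes, rater 2
ratings <- cbind(rater1, rater2)          # one row per coded answer, one column per rater
agree(ratings)                            # simple percent agreement
kappa2(ratings)                           # Cohen's kappa for two raters
```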

Statistical Analysis

We scored student answers on the practice assignments as incorrect (0) or correct (1) and used performance data only from students who provided a final answer to all possible questions in one or both assignments. We analyzed data from 233 students: 133 students completed both practice assignments, 54 students completed only the first assignment, and 46 students completed only the second assignment. Where content areas are not specified, we report results on all content areas together. We analyzed patterns at the level of the individual answer and used logistic regressions to compare answer performance between conditions, content areas, and progression groups, treating performance on one content area as independent from another content area. A student’s performance within a single content area for Q1, Q2, and Q3 was treated as dependent (i.e., a repeated measure), and we used McNemar’s test to analyze differences in percentage correct between questions. To examine trends at the student level, we used ordinary least-squares (OLS) regression analysis.
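The following hedged R sketch illustrates the two kinds of answer-level tests described above, using invented toy data rather than the study data: a logistic regression comparing Q2 correctness between hint conditions, and McNemar’s test for the paired change from Q2 to Q3.

```r
answers <- data.frame(
  correct_q2 = c(1, 0, 1, 1, 0, 0, 1, 0),          # toy 0/1 scores at Q2
  correct_q3 = c(1, 0, 1, 1, 1, 0, 1, 1),          # toy 0/1 scores at Q3 (paired)
  condition  = c("hint_Q2", "hint_Q2", "hint_Q2", "hint_Q2",
                 "hint_Q3", "hint_Q3", "hint_Q3", "hint_Q3")
)
summary(glm(correct_q2 ~ condition, family = binomial, data = answers))  # between conditions
mcnemar.test(table(answers$correct_q2, answers$correct_q3))              # within condition, Q2 vs. Q3
```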

For the analysis of student content language use and content errors, we excluded any trios in which one answer could not be coded (i.e., no problem solving described: 36 answers) or for which there was not enough explanation to be interpretable (31 answers). A total of 342 answers are discussed in this study. We used logistic regression to compare the presence of content-specific language between differing groups within the same hint condition.

For the GCA and instructor-generated final exam questions, we report performance as percentage correct. We excluded GCA pretest scores for individuals who took less than 6 minutes to complete the online questionnaire with the GCA or did not finish at least 85% of the questions. For both the GCA and the instructor-generated final exam (a total of 150 points), a subset of questions addressed the same content areas as the practice assignment questions and are termed “practice-related” questions in this study. For the GCA, practice-related questions included one multiple-choice question per content area (questions 10, 20, 24, 25) for a total of 8 points. For the instructor-generated final exam, there were two short-answer questions on nondisjunction and recombination and one multiple-choice question on probability, worth a total of 21 points. We also calculated performance on the remaining questions from the GCA and from the instructor-generated final exam (“practice-unrelated” questions). We used OLS regression analysis to examine the association between a student’s practice assignment and exam performance, and we report unstandardized beta coefficients. We used average performance on practice Q3 questions (“practice Q3 correct”), a measure of practice success, as the predictor. We also included average performance on Q1 (“practice Q1 correct”) in the regression models. For assessment performance analyses, we examined only students who completed both practice assignments (three total content areas) to ensure that all practice predictor variables were calculated based on the same number of questions (three Q3s and Q1s). Out of 133 students who completed both practice assignments, 109 students completed the GCA pre- and posttest and instructor-generated final exam and thus were included in the OLS models. The OLS regression model was the following for the GCA and instructor-generated exam questions, both practice related and practice unrelated:

Exam performance = β0 + β1(practice Q3 correct) + β2(practice Q1 correct) + β3(GCA pretest) + ε

We also compared assessment outcomes for students who completed the GCA at both time points and the final exam but did not complete any practice assignments ( n = 35) with those who completed all assessments and practice assignments (via OLS or independent t tests, as indicated). For this analysis, the OLS regression model was the following for the GCA and instructor-generated exam questions, both practice-related and practice-unrelated:

Exam performance = β0 + β1(completed both practice assignments) + β2(GCA pretest) + ε

We used Stata v. 15.0 and R v. 3.3.3 (dplyr, VennDiagram, statmod, VGAM, irr packages) for all statistical tests. The cutoff for statistical significance was defined as an alpha of 0.05.
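As a hedged illustration, the following R sketch fits a model of the same form as model 1 above on simulated placeholder data; the variable names and values are invented, not the authors’ dataset.

```r
set.seed(2)
n <- 109                                             # number of students in the OLS models
students <- data.frame(
  practice_q3_correct = runif(n),                    # mean practice Q3 score (0-1)
  practice_q1_correct = runif(n),                    # mean practice Q1 score (0-1)
  gca_pretest         = runif(n, 0.3, 0.9)           # GCA pretest proportion correct
)
students$exam_related <- with(students,
  0.2 + 0.4 * practice_q3_correct + 0.1 * practice_q1_correct +
  0.3 * gca_pretest + rnorm(n, sd = 0.1))            # simulated exam outcome
model1 <- lm(exam_related ~ practice_q3_correct + practice_q1_correct + gca_pretest,
             data = students)
summary(model1)$coefficients                         # unstandardized estimates
```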

Human Subjects Approval

This work was reviewed by the University of Colorado Institutional Review Board, and the use of human subjects was approved (protocols 16-0511 and 15-0380).

Practice Problem-Solving Performance: Question Difficulty

By randomizing the order in which students answered each question within a content area, we were able to use student performance on the first question to compare the difficulty of each of the three questions. For all content areas except imprinting, the questions were isomorphic (χ2, p > 0.05), and answering the imprinting question did not influence student performance on recombination questions (taking the recombination question first vs. second in the practice assignment; logistic regression, p > 0.05). Therefore, from this point on, all data presented represent the four remaining content areas: probability, nondisjunction, gel/pedigree, and recombination.

Two hundred thirty-three students answered a total of 553 trios of questions (Q1, Q2, Q3). The number of trios answered varies for each content area, because not all students answered all questions or completed both assignments: In the first assignment, students answered trios in two out of three content areas (randomly assigned), and in the second assignment, all students answered the trio of questions on recombination. We first examined the performance of all students across all four content areas and then for each content area individually ( Table 2 ). For all content areas combined, student performance increased from question 1 (Q1) to questions 2 (Q2) and 3 (Q3). Upon examination of each content area individually, however, we found that the percentage of correct answers increased from Q1 to Q3 in recombination and gel/pedigree, but not for the content areas of nondisjunction and probability. In comparing Q1 performance between content areas, students had a higher percent correct for gel/pedigree and nondisjunction questions than for probability and recombination questions and a higher percent correct for probability than for recombination ( Table 2 ).

Table 2. Performance on practice problem-solving questions.

Hint Choice

Although all students were given the option to receive a content hint for each content area during practice assignments, they only took this option in 68% of trios overall (Supplemental Table S1). Students who were offered the hint at Q2 were equally likely as those who were offered the hint at Q3 to take a hint for any given content area. For the most difficult content area (recombination), students chose to take a hint more often than for the easier content area of gel/pedigree. When looking at performance across all content areas combined, students who took a hint in a given trio scored significantly lower on all three questions than students who did not take a hint in a given trio (Supplemental Table S2). This pattern, while not always significant, was also seen in each individual content area (Supplemental Table S2). Additionally, across all content areas combined, answers did not show improvement, on average, from Q1 to Q3 in trios in which a hint was not taken, while they did in trios in which a hint was taken. This difference was also significant in the individual content area of recombination, but not the other content areas (Supplemental Table S2). To maintain reasonable sample sizes in our analyses, we combined all content areas together for the remainder of the data in this paper regarding practice performance.

We also characterized students’ initial Q1 performance based on frequency of taking a hint. To best represent whether a student had a consistent pattern in hint choice, we focused on only the students who completed questions in both practice assignments (the maximum of three content areas). Of the 133 students who completed both assignments, 14 students never chose to take a hint, 56 students sometimes chose to take a hint, and 63 students always chose to take a hint when offered. Students who never took a hint performed better on Q1 than students who always took a hint (Supplemental Table S3). We have not further analyzed answer trios in which a student chose not to take a hint for several reasons. We did not have a randomization process for hint presentation: all students were given the option, and those who did not take a hint chose not to do so for reasons that we could not directly examine. In addition, because so few of the students in the study chose to never take a hint, and because we were primarily interested in the effect of taking a content hint on student success, we focused on the students who did take a hint, randomized to either Q2 or Q3 within a trio.

Content Hints Help a Subset of Students

To examine the immediate effect of a content hint on student performance, we focused the remainder of our analyses on situations in which students took a hint. We used Q1 as a baseline measure of student performance in a given content area. Because students were offered a hint either at Q2 or at Q3, we compared student performance at Q2 in the presence or absence of a hint for this question. To examine whether performance was maintained (for hint at Q2 condition) or whether the hint improved performance compared with Q2 (for hint at Q3 condition), we examined performance at Q3. For the students who took a hint, we first looked at aggregate data at the level of individual answers, binning answers into Q1 correct versus incorrect and then looking at performance on the subsequent two questions ( Figure 2 ). As shown in Figure 2 A, if students answered Q1 correctly within a trio, 15% went on to answer Q2 incorrectly (without a hint), indicating that practice itself may not help these students who initially answer correctly. Students who did receive a hint at Q2 performed the same as those who did not, indicating the drop in performance from Q1 to Q2 was not due to the hint. In a given trio, Q3 performance also did not differ based on when a hint was received, and performance, on average, did not change from Q2 to Q3, indicating that a hint did not positively or negatively impact performance for these students who initially answer correctly.

Figure 2. The effect of a hint differs depending on Q1 correctness. (A) Q1 incorrect: the percent of correct answers for Q2 and Q3 is shown for trios in which a hint was taken at Q2 (n = 84 trios) or at Q3 (n = 110 trios). *, p < 0.05; all else NS, p > 0.05 (logistic regression between conditions; McNemar’s test between Q2 and Q3 for each condition). (B) Q1 correct: the percent of correct answers for Q2 and Q3 is shown for trios in which a hint was taken at Q2 (n = 89 trios) or at Q3 (n = 91 trios). There were no significant differences between conditions (logistic regression, p > 0.05) or between Q2 and Q3 (McNemar’s test, p > 0.05).

If students answered Q1 incorrectly within a trio, 21% went on to answer Q2 correctly without a hint, suggesting that practice alone can help these students who initially answer incorrectly ( Figure 2 B). However, a significantly higher percent of students answered correctly upon receiving a hint at Q3. Students who took the hint at Q2 were significantly more likely to get Q2 correct than students who had not yet taken a hint, indicating the hint provides an added benefit beyond practice itself. A similar percent of the students who took a hint at Q2 also answered Q3 correctly, indicating that, on average, they maintained performance on a subsequent question after the hint was taken. By the third question in a content area, all students had received a hint, some at Q2 and some at Q3. Those who took a hint at Q3 performed equivalently on Q3 to those who had taken a hint at Q2, indicating that students benefited similarly at the end of practicing a given content area, regardless of when the hint was received.

To examine how individual students performed sequentially on a trio of questions, we followed the progression of individual students from Q1 to Q3 (Figures 3 and 4). Students took a hint at Q2 in 173 trios of questions (Figure 3). Of these, 49% of students in a given trio answered Q1 incorrectly. Thirty-seven percent of those moved on to get Q2 correct when they received a hint, and then 68% of those went on to get Q3 correct. Thus, the majority, but not all students, maintained this improvement from Q2 to Q3. Students took a hint at Q3 in 201 trios of questions (Figure 4). Of these, 55% of students in a given trio answered Q1 incorrectly. Seventy-nine percent of those also got Q2 incorrect, and then 26% of those moved on to get Q3 correct when they received a hint. As seen in Figures 3 and 4, while a hint helped some students answer a subsequent question correctly, a hint did not help all students; some students answered Q1, Q2, and Q3 incorrectly despite taking a hint.

Figure 3. Student-level progression across answer trios in which a hint was taken at Q2. Percent of correct answers is shown with the number of answers in each category (e.g., Q1 incorrect) in parentheses. Arrows indicate the percent of answers that track to the next category. Bolded arrows signify categories of trios that were analyzed for content-specific language use and errors/omissions: trios with Q1 incorrect but Q2 and Q3 correct (011 group) and those with all three answers incorrect (000 group).

Figure 4. Student-level progression for answer trios in which a hint was taken at Q3. Percentage of correct answers is shown with the number of answers in each category (e.g., Q1 incorrect) in parentheses. Arrows indicate the percent of answers that track to the next category. Bolded arrows signify categories of trios that were analyzed for content-specific language use and errors/omissions: trios with Q1 and Q2 incorrect but Q3 correct (001 group) and those with all three answers incorrect (000 group).

Content-Specific Language Use and Errors or Omissions

To further explore why the hint did not help some students but did help others, we examined how students used the given content hint. We categorized within a student’s documented problem-solving answer 1) the presence of language that reflected the content described in the hint (coded as present or absent; Table 3), and 2) the types of content errors and omissions made in solving the problem, tracking both correctness and language use across the three questions (Q1, Q2, Q3) for each content area (Table 4). Only the following selection of students who answered Q1 incorrectly and took a hint were considered for this analysis (see bolded arrows in Figures 3 and 4): students in a given trio who answered Q2 and Q3 correctly after taking a hint at Q2 (defined as 011), those who answered correctly after taking a hint at Q3 (defined as 001), and those who answered incorrectly on all three questions (defined as 000). Students who shifted from incorrect at Q1 to correct at Q2 or Q3 (011 and 001 students, respectively) more often used language associated with the content of the hint than students who answered all three questions incorrectly. In cases in which students took a hint at Q2, 83% of answers in the 011 group contained language reflecting the hint content compared with 55% in the 000 group (n = 40 and 74 Q2 and Q3 answers, respectively; logistic regression, odds ratio [OR] = 3.8, p < 0.01). Similarly, when students took a hint at Q3, 91% of answers in the 001 group contained language reflecting the hint content compared with 60% in the 000 group (n = 23 and 60 Q3 answers, respectively; logistic regression, OR = 9.2, p < 0.01).

Table 3. Presence of language reflecting content in hint criteria, coded only in answers during and after receipt of a hint.

Table 4. Content errors and omissions codes.

Students who continued to answer incorrectly (000 group) displayed a wide variety of content-specific errors and omissions, including multiple errors or omissions within a single answer. Figure 5 shows these errors and omissions for Q1 through Q3 categorized by content area, with each error type or omission represented by different colored circles. For each content area, the orange shading represents an error or omission related to the hint content; the other colors represent different errors or omissions specific to each content area and not related to the content hint. Details for each content area for the 000 group are given in the following sections.

Figure 5. Presence of content errors and omissions in incorrect answers in four critical content areas in genetics. The number of answers in which each content error/omission code was observed is shown, with overlap in color indicating the presence of multiple errors/omissions within a single answer. Only 000 progression groups are shown for all questions Q1–Q3. In each case, orange shading indicates an error aligned with the hint content.

Recombination

In the recombination questions, the most common error in the 000 group was no use of map units to solve the problem (57% of 143 answers; Figure 5 A, orange oval). In addition, students made three other types of errors, sometimes in addition to the most common error. In some answers, while map units were used, they were used incorrectly (29%; Figure 5 A, blue oval). Students also made errors in gamete-type identification in which they incorrectly assigned the type of gamete (recombinant or parental) or assigned the probability of recombination to the nonrecombinant gamete (22%; Figure 5 A, green oval). Less often, students incorrectly identified the desired genotype to solve the problem (4%; Figure 5 A, magenta oval). Even after receiving the hint defining map distance, many students made the most common error of not using map units to solve the problem (“No use of map units”; 49% of 67 answers), even though some of these students ( n = 12) used the content language of the hint.

Probability

In the probability questions in this study, students needed to appropriately assign offspring having a probability of 2/3 for a certain genotype based on information about the parents and the mode of inheritance (due to one possible offspring genotype from a parental mating being eliminated). The two most common errors in the 000 group were incorrectly assigning at least one genotype or probability (which includes not using the 2/3 probability correctly; 81% of 67 answers; Figure 5 B, orange circle) and not using or improperly using the product rule for multiplying multiple independent probabilities (64%; Figure 5 B, green circle). These two errors were most commonly present in combination in the same answer (40%; Figure 5 B). While not as common, student answers sometimes contained the error of inaccurate use of modes of inheritance or calculations, either alone or in combination with other errors (21%; Figure 5 B, blue circle). Even after receiving the hint about the 2/3 probability, many students made incorrect genotype or probability assignments (“Genotype/probability misassignment”; 70% of 33 answers), even though some of these students ( n = 5) used the content language of the hint.

Gel/Pedigree

Gel/pedigree was one of the two higher-performing categories (the other being nondisjunction), so there are fewer answers in the 000 group. In these problems, students were asked to interpret both a gel and pedigree to determine inheritance patterns. To most accurately answer the gel/pedigree questions, examination of the molecular gel information to inform the number of chromosome copies present was needed. The omission of not discussing the number of alleles per gene in males and females was most common (91% of 23 answers; Figure 5 C, orange circle), and while only a few answers contained this single omission, many answers contained this omission in addition to other errors/omissions of not clearly using the provided gel (57% total; Figure 5 C, green circle) and incompletely defining a mode of inheritance (26% total; Figure 5 C, blue circle). Even after receiving the hint about X chromosome allele number, many students made the most common omission of not discussing the number of alleles per gene in males and females (“No discussion of copy number”; 88% of 8 answers), and none of these students used the content language of the hint.

Nondisjunction

In the nondisjunction problems, students were asked to identify the cause of an incorrect chromosome number after meiosis. Three errors in understanding of meiosis were present at similar levels in answers in the 000 group, including students not accurately describing homologues versus sister chromatids and/or in what phase they separated at the metaphase plate (30% of 33 answers; Figure 5 D, orange circle), students not sufficiently understanding that phases in meiosis (I or II) should be considered and differentiated (42%; Figure 5 D, green circle), and students not understanding the typical outcome of meiosis or how errors could occur (33%; Figure 5 D, blue circle). After receiving the hint describing chromosome alignment during meiosis, several students still made the error of not accurately describing homologues versus sister chromatids and/or in what phases they separated in meiosis (“Incorrect chromosome definition/separation rules”; 38% of 13 answers), even though some of these students ( n = 3) used the content language of the hint.

Practice Is Associated with Higher Longer-Term Assessment Performance

In addition to the immediate impact of a hint on student performance during a practice assignment, we also examined whether practice itself was associated with longer-term performance on a final exam. Of the 233 students who completed practice assignments, 133 completed both assignments, and 100 completed only one assignment. To ensure that all practice predictor variables were calculated based on the same number of questions (three Q1s and Q3s), we focused on only the students who completed both practice assignments. Of the 133 students who completed both assignments, 109 of these students completed the GCA pre- and posttest and instructor-generated final exam: These are the students included in the final analyses reported in Table 5 and Supplemental Tables S4 and S5. Using the mean performance on Q3 practice questions as a measure of “success” in the practice assignments (Supplemental Table S4), we found that, for students who completed both practice assignments, success in practice significantly predicted both GCA posttest and instructor-generated question performance for practice-related questions (controlling for mean Q1 performance and GCA pretest performance; Table 5 , models 1 and 2). These students also had significantly higher scores on practice-unrelated GCA posttest and instructor-generated questions ( Table 5 , models 3 and 4).

Table 5. OLS regression estimates of the association between practice performance and final exam performance (*p < 0.05; **p < 0.01; ***p < 0.001).

Finally, we examined whether there was a difference in final exam performance between students who did not complete any practice assignments and those who completed both assignments. There were 35 students who did not complete any practice assignments but did complete the GCA pre- and posttest and instructor-generated final exam. We used GCA pretest scores to control for potential differences in incoming genetics knowledge between the group of students who completed both practice assignments and those who completed none, although we could not control for other factors, such as motivation or interest. There was no significant difference in the GCA pretest scores between these two groups (Supplemental Table S4), but students who completed the practice questions had higher GCA posttest and instructor-generated final exam scores than students who did not practice (Supplemental Table S5).

Content Hints Help a Subset of Students during Problem-Solving Practice

We administered genetics practice problems to students on concepts that had already been presented and practiced in class. Overall, we found that some students benefit from this practice, in particular if they initially answer incorrectly. Owing to the design of our study, each student completed at least one question (Q1) within a content area without any assistance. Students then received a hint on one of the subsequent questions. This provided students with the opportunity to struggle through the first question for each concept on their own before receiving assistance. An initial struggle without assistance, followed by feedback, has been shown to help students’ future performance ( Kapur and Bielaczyc, 2012 ), and although we did not provide feedback to students about whether they were correct or incorrect in their initial answers, we gave all students a chance to receive scaffolding via a content hint. For students who had initially answered Q1 incorrectly, when they took a content hint while answering Q2, 37% answered correctly, while only 21% of students answered this question correctly if they did not take a hint at Q2. This difference of 16% indicates that, although practice alone can help, practice with content scaffolding helps more students. In addition, we have demonstrated that students benefit from a content hint regardless of whether they receive that hint at the second question or at the third question. This suggests that students who are learning from the hint at Q2 are able to apply this knowledge in answering the next question. Once they receive a key piece of content, the students who use the hint successfully continue to do so on future problems.

Not all students in this study chose to take an offered hint when solving practice problems. Students who did not take a hint for a particular trio had a higher Q1 score than students who did take a hint. Along with these baseline differences in performance, several possible factors could have influenced students’ choices. One component of student choice could relate to self-regulatory capacity in monitoring their understanding ( Aleven et al. , 2003 ). Students who did not take a hint may have felt confident in their problem-solving ability and thus chose not to view additional information they felt they already knew. In a study that examined students’ use of three-dimensional molecular models to assist in drawing molecular representations, some students did not use models even when the models were placed directly into their hands ( Stull et al. , 2012 ). Some of these students reported thinking they did not need the models to answer the given questions ( Stull et al. , 2012 ). This supports the idea that students who do not use provided hints may simply feel they do not need them. On the other hand, 29% of the students in our study who did not take a hint answered the first question incorrectly, indicating their confidence was misplaced. Similarly, in a study that offered computer-tailored hints for solving problems, even students predicted to benefit from hints did not always take them ( Aleven et al. , 2006 ). In the current study, due to the constructed-response nature of the questions, students could not receive immediate feedback on whether they correctly answered a question. Thus, there would be value in examining whether immediate feedback on performance would influence students’ future choices. Because we could not discover students’ rationales for not taking a hint in this study, we cannot make any further conclusions about their choices.

Utility of a Single Content Idea

We showed that the inclusion of just one content idea as a hint helped some initially struggling students understand a concept, potentially by activating their prior knowledge related to the hint content. In looking at these students’ problem solving, we found that students who improved in a given content trio (011 and 001 groups) more often used language similar to the content of the hint than students who consistently answered incorrectly in a given trio (000 group). Thus, for students helped by the hint, this particular piece of content likely was critical for correctly solving the problem. Adding to previous frameworks ( Alexander and Judy, 1988 ; Alexander et al. , 1989 ), we suggest that this declarative (content) knowledge is the component of domain-specific knowledge that is needed to effectively apply procedural (e.g., strategic) knowledge to accurately solve a problem. In future studies, we plan to further explore the details of students’ procedural processes during problem solving and to determine whether a student’s inability to recall a piece of information is the main reason for an incorrect answer or whether there are additional higher-order cognitive skills and processes required for correct problem solving.

Some students continued to answer all questions in a content trio incorrectly (000 group) despite a content hint. These students often had multiple gaps in content knowledge or made content errors or omissions not related to the content hint. In future studies, students could receive tailored content hint(s) to match all errors that are present; this could allow us to determine whether the lack of content is the reason for incorrect answers, rather than a lack of procedural process skills. In one previous study, a computer program for solving problems that provides tailored hints and feedback was used to specifically assist in genetics problem solving, providing up to four hints specific to each component of a given problem (the Genetics Cognitive Tutor; Corbett et al. , 2010 ). The authors found a significant improvement in learning from pre- to postcompletion of this program ( Corbett et al. , 2010 ).

In cases in which students consistently answered incorrectly (000 group), some used language related to the content hint but made errors when trying to apply the hint in their explanations. If students have inaccurate knowledge on how to apply content, even when correct content ideas are provided, a hint may be insufficient. Indeed, Smith (1988) found that unsuccessful problem solvers can often identify important pieces of information but do not know how to apply this information. In this case, providing more scaffolding to a student, such as by providing students with worked examples of similar problems (e.g., Sweller and Cooper, 1985 ; Renkl and Atkinson, 2010 ) or providing more guided hints and feedback via a cognitive tutor (e.g., Corbett et al. , 2010 ), may be needed.

These students who consistently answer incorrectly may also be lacking critical problem-solving skills. In this study, we focused on the use and application of content knowledge, but in future studies, we will examine the problem-solving processes taken by students who answer correctly and compare these with the processes used by students who answer incorrectly. Certain skills may be particularly critical, such as displaying metacognitive ability (the knowledge and regulation of one’s own cognition). Activating prior knowledge by identifying similarities between problems is an effective metacognitive skill to help orient oneself to a problem ( Gick and Holyoak, 1980 ; Smith, 1988 ; Meijer et al. , 2006 ), and using this behavior in combination with several other metacognitive skills, including planning and checking work, can improve problem-solving ability ( Mevarech and Amrany, 2008 ). Thus, a prompt that asks students to explain how the content in a hint is related to information the student has used previously to solve a problem may be helpful, as it may elicit their prior knowledge of solving similar problems.

Content-Specific Errors and Omissions

Recombination.

For the topic of recombination, students who answered consistently incorrectly (000 group) did not often use map units to determine the probability of offspring when considering two linked genes; instead, many students attempted to solve the problem using Punnett squares and/or the logic of solving a probability question for genes on different chromosomes. Even when students used map units, they often either performed incorrect calculations or assigned recombinant probabilities to the incorrect genotypes. This suggests that the conceptual idea behind calculating probability of inheritance using linked genes is challenging.
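As a worked sketch of the map-unit logic at issue (the numbers and cross are assumed for illustration, not taken from the study questions), consider two linked genes 20 map units apart in a parent with the AB/ab arrangement, testcrossed to an ab/ab individual:

```r
map_units <- 20                       # assumed distance between the two linked genes
r <- map_units / 100                  # recombination frequency between them
recombinant_gamete <- r / 2           # each recombinant gamete type (Ab or aB) = 0.10
parental_gamete <- (1 - r) / 2        # each parental gamete type (AB or ab) = 0.40
# P(Aabb offspring) = P(Ab gamete from AB/ab parent) * P(ab gamete from ab/ab tester)
p_Aabb <- recombinant_gamete * 1
p_Aabb                                # 0.10
```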

Probability.

Students struggled to calculate the probability that an unaffected child of two heterozygotes would be a heterozygote. Instead of using information in the pedigree to eliminate one of the genotype possibilities (homozygous recessive), students often assumed that the probability of a heterozygous offspring of two carriers is 1/2 rather than 2/3. For students who answered these questions consistently incorrectly (000 group), the most common error combined not using the 2/3 probability with failing to apply the product rule appropriately to account for multiple generations. This suggests that struggling students do not understand the broader concept of how to consider multiple generations when determining probability and thus have difficulty integrating multiple ideas into their solutions. Indeed, previous work has shown that many students have difficulty with both of these types of calculations ( Smith, 1988 ; Smith and Knight, 2012 ).
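As a concrete illustration of the two ideas these students missed, here is a minimal R sketch for a hypothetical autosomal recessive pedigree; the specific matings are assumptions for illustration, not the study's problem.

# Parents are both carriers (Aa x Aa) for a recessive condition.
p_offspring <- c(AA = 1/4, Aa = 1/2, aa = 1/4)

# (1) An unaffected child cannot be aa, so condition on "unaffected":
p_carrier_given_unaffected <- p_offspring["Aa"] / (p_offspring["AA"] + p_offspring["Aa"])
unname(p_carrier_given_unaffected)          # 2/3, not 1/2

# (2) Product rule across generations: the probability that this unaffected child,
# mated to a known carrier (Aa), has an affected child of their own:
unname(p_carrier_given_unaffected * 1/4)    # 2/3 * 1/4 = 1/6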

Gel/pedigree.

Students who answered consistently incorrectly (000 group) most frequently displayed difficulty in reading the gel to identify the number of allele copies and then connecting this information to the pedigree. In this course, students were taught that, although gels are not always quantitative, one can use the thickness of bands on a DNA gel to determine the relative amounts of DNA present in a sample. Despite being taught this convention, students still had difficulty applying the concepts of both allele number (e.g., only one X chromosome allele for a male) and amount of DNA (e.g., a thicker band representing two of the same alleles for an individual). Thus, students need more practice interpreting information on gels.
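The gel-reading convention described above can be summarized in a small sketch; the X-linked gene and allele names below are hypothetical and only illustrate the band-counting logic, written in R for consistency with the analyses later in the document.

# Expected band pattern for a hypothetical X-linked gene with alleles X1 and X2.
# Band thickness is treated as roughly proportional to allele copy number.
expected_bands <- list(
  "X1 Y male"    = c(X1 = 1),           # one allele copy: single band
  "X1 X1 female" = c(X1 = 2),           # two copies of the same allele: one thicker band
  "X1 X2 female" = c(X1 = 1, X2 = 1)    # one copy of each allele: two bands
)
expected_bands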

Nondisjunction.

In nondisjunction questions, students who consistently answered incorrectly (000 group) had a diversity of misunderstandings about meiosis, with three errors being most common. The nondisjunction questions explicitly asked students to specify a phase in meiosis, if any, that was affected. However, students often failed to consider in which meiotic division, I or II, an error could occur, or they expressed uncertainty about differentiating between the two phases of meiosis. Students also struggled with identifying when during meiosis homologous versus sister chromatids separate; they sometimes attempted to identify the type of chromosome that was failing to separate or to state when each would normally separate, but they were often incorrect. The third error students made represented a general misunderstanding of meiosis in which students incorrectly identified the number of each chromosome that should be present in a gamete, or students assumed an atypical event, such as multiple rounds of replication, must have occurred to produce a gamete with one extra chromosome. Previous work on this topic also found that students demonstrate many errors when depicting meiosis, including incorrect chromosome alignment during metaphase ( Wright and Newman, 2011 ; Newman et al. , 2012 ).
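To make the meiosis I versus meiosis II distinction concrete, this small sketch lists the gametes expected from a single hypothetical Aa chromosome pair when nondisjunction occurs in each division (no crossing over is assumed).

# Four gametes from one meiosis of an Aa cell ("-" marks a gamete missing the chromosome).

# Meiosis I error: homologs fail to separate, so two gametes receive both homologs
# (one A and one a) and two receive neither.
meiosis_I_error  <- c("Aa", "Aa", "-", "-")

# Meiosis II error (in the A-bearing cell): sister chromatids fail to separate, so one
# gamete carries two copies of the SAME allele, one carries none, and the other cell
# divides normally.
meiosis_II_error <- c("AA", "-", "a", "a")   # or "aa", "-", "A", "A" if the error is in the a-bearing cell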

As with previous studies that report on the testing effect (e.g., Adesope et al., 2017; Butler, 2010), we found that practice was associated with later assessment performance. Regardless of performance on the first practice question (Q1) and on the GCA pretest, student success in practice predicted longer-term performance, on both practice-related and practice-unrelated material, on the instructor-generated final exam and on GCA scores in the course. We also showed that students who completed both practice assignments performed better than students who did not complete any practice assignments, controlling for GCA pretest performance. Because we could not randomize students into practice or no-practice conditions, we caution that, even though we used the GCA pretest as a proxy for incoming ability, there are likely many other factors influencing these students' performance. Other factors shown to relate to success include student motivation, interest, and metacognition (e.g., Pintrich and de Groot, 1990; Schiefele et al., 1992; Young and Fry, 2008).

Limitations

Our study addressed four critical content areas in genetics with which we know students struggle. However, students likely have additional or different difficulties on other genetics content. In addition, these questions had only a single correct answer and thus may have been limited in their ability to test student problem-solving skills. In the future, we would like to examine more ill-defined questions with multiple possible solutions ( National Research Council, 2012 ).

While we anticipated that most students would take the option to receive a hint, only 68% of students did so. To provide an accurate representation of the influence of a hint, we had to limit our analyses to those who chose to take a hint. As seen in Stull and colleagues’ ( 2012 ) work on molecular model use and as suggested by our data examining use of the content language reflected in the hint, not all students are likely to use hints, even when hints are easily available. However, it would be interesting to know why students who choose not to take a hint make that decision and whether this decision is based on high confidence or fear that the hint may confuse them.

We also could not test directly whether students who took a hint performed better than those who did not take a hint in longer-term performance, as the only way to measure this is to randomize the students who do and do not receive a hint. We chose not to take this approach, because we felt it was important for student success to give everyone the same access to information.

Implications for Instruction

This study suggests that, after learning a topic in class, a subset of students who initially give incorrect answers to problems on these topics can improve after receiving a single content idea that may fill a knowledge gap. Some students may generally understand how to solve these problems but lack one or two pieces of information; providing the missing piece allows them to apply their knowledge and solve the problem. For these students, reviewing certain pieces of genetics content, which we describe in this study, may be enough to help them solve such problems correctly. Furthermore, we suggest emphasizing the importance of practicing, as this study showed that success at the end of practice predicts longer-term performance in a class, regardless of initial understanding of genetics topics. Even if a student initially struggles with an answer, this “productive failure” can be beneficial to the student’s learning ( Kapur and Bielaczyc, 2012 ). Students who continue to struggle despite content hints likely lack content knowledge as well as problem-solving skills. We plan to further examine such deficits in how students solve problems in order to provide suggestions that are focused on the logical steps and metacognitive processes necessary for solving problems. Such instruction may be most beneficial after students have an initial chance to practice problems, so that they have a chance to challenge themselves before receiving hints.

Supplementary Material

Acknowledgments

This work was supported by the National Science Foundation (DUE 1711348). We thank Oscar Whitney for assistance with initial development and testing of questions and Ashton Wiens and Felix Jimenez for assistance with statistical analyses. We are also grateful to Paula Lemons, Stephanie Gardner, and Laura Novick for their advice on the project and to all of the students who participated in this study.

  • Adesope O. O., Trevisan D. A., Sundararajan N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research, (3), 659–701. doi:10.3102/0034654316689306
  • Aleven V., McLaren B., Roll I., Koedinger K. (2006). Toward meta-cognitive tutoring: A model of help-seeking with a cognitive tutor. International Journal of Artificial Intelligence in Education, 101–130.
  • Aleven V., Stahl E., Schworm S., Fischer F., Wallace R. (2003). Help seeking and help design in interactive learning environments. Review of Educational Research, (3), 277–320. doi:10.3102/00346543073003277
  • Alexander P. A., Judy J. E. (1988). The interaction of domain-specific and strategic knowledge in academic performance. Review of Educational Research, (4), 375–404. doi:10.3102/00346543058004375
  • Alexander P. A., Pate P. E., Kulikowich J. M., Farrell D. M., Wright N. L. (1989). Domain-specific and strategic knowledge: Effects of training on students of differing ages or competence levels. Learning and Individual Differences, (3), 283–325. doi:10.1016/1041-6080(89)90014-9
  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC.
  • Bassok M., Novick L. R. (2012). Problem solving. In Holyoak K. J., Morrison R. G. (Eds.), Oxford handbook of thinking and reasoning (pp. 413–432). New York: Oxford University Press.
  • Bloom B. S., Engelhart M. D., Furst E. J., Hill W. M., Krathwohl D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. New York: David McKay.
  • Bransford J. D., Johnson M. K. (1972). Contextual prerequisites for understanding: Some investigations of comprehension and recall. Journal of Verbal Learning and Verbal Behavior, (6), 717–726. doi:10.1016/S0022-5371(72)80006-9
  • Butler A. C. (2010). Repeated testing produces superior transfer of learning relative to repeated studying. Journal of Experimental Psychology: Learning, Memory, and Cognition, (5), 1118–1133. doi:10.1037/a0019902
  • Chi M. T. H. (1981). Knowledge development and memory performance. In Friedman M. P., Das J. P., O'Connor N. (Eds.), Intelligence and learning (pp. 221–229). Boston: Springer US. doi:10.1007/978-1-4684-1083-9_20
  • Chi M. T. H., Feltovich P. J., Glaser R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, (2), 121–152. doi:10.1207/s15516709cog0502_2
  • Corbett A., Kauffman L., Maclaren B., Wagner A., Jones E. (2010). A Cognitive Tutor for genetics problem solving: Learning gains and student modeling. Journal of Educational Computing Research, (2), 219–239.
  • Dooling D. J., Lachman R. (1971). Effects of comprehension on retention of prose. Journal of Experimental Psychology, 216.
  • Gick M. L., Holyoak K. J. (1980). Analogical problem solving. Cognitive Psychology, (3), 306–355. doi:10.1016/0010-0285(80)90013-4
  • Kapur M., Bielaczyc K. (2012). Designing for productive failure. Journal of the Learning Sciences, (1), 45–83. doi:10.1080/10508406.2011.591717
  • Landis J. R., Koch G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, (1), 159–174.
  • Meijer J., Veenman M. V. J., van Hout-Wolters B. H. A. M. (2006). Metacognitive activities in text-studying and problem-solving: Development of a taxonomy. Educational Research and Evaluation, (3), 209–237. doi:10.1080/13803610500479991
  • Mevarech Z. R., Amrany C. (2008). Immediate and delayed effects of meta-cognitive instruction on regulation of cognition and mathematics achievement. Metacognition and Learning, (2), 147–157. doi:10.1007/s11409-008-9023-3
  • National Research Council. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press. doi:10.17226/13362
  • Newman D. L., Catavero C. M., Wright L. K. (2012). Students fail to transfer knowledge of chromosome structure to topics pertaining to cell division. CBE—Life Sciences Education, (4), 425–436. doi:10.1187/cbe.12-01-0003
  • Novick L. R., Bassok M. (2005). Problem solving. In Holyoak K. J., Morrison R. G. (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–349). New York: Cambridge University Press.
  • Pintrich P. R., de Groot E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, (1), 33–40. doi:10.1037/0022-0663.82.1.33
  • Pol H. J., Harskamp E. G., Suhre C. J. M., Goedhart M. J. (2008). The effect of hints and model answers in a student-controlled problem-solving program for secondary physics education. Journal of Science Education and Technology, (4), 410–425. doi:10.1007/s10956-008-9110-x
  • Prevost L. B., Lemons P. P. (2016). Step by step: Biology undergraduates' problem-solving procedures during multiple-choice assessment. CBE—Life Sciences Education, (4), ar71. doi:10.1187/cbe.15-12-0255
  • Renkl A., Atkinson R. K. (2010). Learning from worked-out examples and problem solving. In Plass J. L., Moreno R., Brünken R. (Eds.), Cognitive load theory (pp. 91–108). New York: Cambridge University Press.
  • Roediger H. L., Karpicke J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, (3), 249–255. doi:10.1111/j.1467-9280.2006.01693.x
  • Schiefele U., Krapp A., Winteler A. (1992). Interest as a predictor of academic achievement: A meta-analysis of research. In Renninger K. A., Hidi S., Krapp A. (Eds.), The role of interest in learning and development (pp. 183–212). Hillsdale, NJ: Erlbaum.
  • Smith J. I., Combs E. D., Nagami P. H., Alto V. M., Goh H. G., Gourdet M. A. A., … Tanner K. D. (2013). Development of the biology card sorting task to measure conceptual expertise in biology. CBE—Life Sciences Education, (4), 628–644. doi:10.1187/cbe.13-05-0096
  • Smith M. K., Knight J. K. (2012). Using the Genetics Concept Assessment to document persistent conceptual difficulties in undergraduate genetics courses. Genetics, (1), 21–32. doi:10.1534/genetics.111.137810
  • Smith M. K., Wood W. B., Knight J. K. (2008). The Genetics Concept Assessment: A new concept inventory for gauging student understanding of genetics. CBE—Life Sciences Education, (4), 422–430.
  • Smith M. U. (1988). Successful and unsuccessful problem solving in classical genetic pedigrees. Journal of Research in Science Teaching, (6), 411–433. doi:10.1002/tea.3660250602
  • Smith M. U., Good R. (1984). Problem solving and classical genetics: Successful versus unsuccessful performance. Journal of Research in Science Teaching, (9), 895–912. doi:10.1002/tea.3660210905
  • Stull A. T., Hegarty M., Dixon B., Stieff M. (2012). Representational translation with concrete models in organic chemistry. Cognition and Instruction, (4), 404–434. doi:10.1080/07370008.2012.719956
  • Sweller J., Cooper G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, (1), 59–89. doi:10.1207/s1532690xci0201_3
  • Wright L. K., Newman D. L. (2011). An interactive modeling lesson increases students' understanding of ploidy during meiosis. Biochemistry and Molecular Biology Education, (5), 344–351. doi:10.1002/bmb.20523
  • Young A., Fry J. D. (2008). Metacognitive awareness and academic achievement in college students. Journal of the Scholarship of Teaching and Learning, (2), 1–10.

Biology LibreTexts

Exercises: Genetics (Hardison)


These are homework exercises to accompany Hardison's "Working with Molecular Genetics" TextMap. Genetics is the study of genes, genetic variation, and heredity in living organisms.

  • 1.E: Fundamental Properties of Genes (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 3.E: Isolating and Analyzing Genes (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 4.E: Genomes and Chromosomes (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 5.E: DNA replication I: Enzymes and Mechanism (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 6.E: DNA replication II: Start, stop and control (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 7.E: Mutation and Repair of DNA (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 8.E: Recombination of DNA (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 9.E: Transposition of DNA (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 10.E: Transcription: RNA polymerases (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 11.E: Transcription: Promoters, terminators and mRNA (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 12.E: RNA Processing (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 13.E: Genetic Code (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 14.E: Translation - Protein synthesis (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 15.E: Positive and negative control of gene expression (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 16.E: Transcription regulation via effects on RNA polymerases (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 18.E: Transcriptional regulation after initiation (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 19.E: Transcriptional regulation in eukaryotes (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.
  • 20.E: Transcriptional regulation via chromatin alterations (Exercises) Problems for the Textmap "Genetics" by Ross Hardison.

Contributors and Attributions

Ross C. Hardison, T. Ming Chu Professor of Biochemistry and Molecular Biology (The Pennsylvania State University)


Successful Problem Solving in Genetics Varies Based on Question Content

  • Jennifer S. Avena
  • Betsy B. McIntosh
  • Oscar N. Whitney
  • Ashton Wiens
  • Jennifer K. Knight

Department of Molecular, Cellular, and Developmental Biology

School of Education, University of Colorado Boulder, Boulder, CO 80309


Department of Applied Mathematics, University of Colorado Boulder, Boulder, CO 80309

*Address correspondence to: Jennifer Knight (E-mail Address: [email protected]).

Problem solving is a critical skill in many disciplines but is often a challenge for students to learn. To examine the processes both students and experts undertake to solve constructed-response problems in genetics, we collected the written step-by-step procedures individuals used to solve problems in four different content areas. We developed a set of codes to describe each cognitive and metacognitive process and then used these codes to describe more than 1800 student and 149 expert answers. We found that students used some processes differently depending on the content of the question, but reasoning was consistently predictive of successful problem solving across all content areas. We also confirmed previous findings that the metacognitive processes of planning and checking were more common in expert answers than student answers. We provide suggestions for instructors on how to highlight key procedures based on each specific genetics content area that can help students learn the skill of problem solving.

INTRODUCTION

The science skills of designing and interpreting experiments, constructing arguments, and solving complex problems have been repeatedly called out as critical for undergraduate biology students to master ( American Association for the Advancement of Science, 2011 ). Yet each of these skills remains elusive for many students, particularly when the skill requires integrating and evaluating multiple pieces of information ( Novick and Bassok, 2005 ; Bassok and Novick, 2012 ; National Research Council, 2012 ). In this paper, we focus on describing the steps students and experts take while solving genetics problems and determining whether the use of certain processes increases the likelihood of success.

The general process of solving a problem has been described as building a mental model in which prior knowledge can be used to represent ways of thinking through a problem state ( Johnson-Laird, 2010 ). Processes used in problem solving have historically been broken down into two components: those that use domain-general knowledge and those that use domain-specific knowledge. Domain-general knowledge is defined as information that can be used to solve a problem in any field, including such strategies as rereading and identifying what a question is asking ( Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ). Although such steps are important, they are unlikely to be the primary determinants of success when specific content knowledge is required. Domain-specific problem solving, on the other hand, is a theoretical framework that considers one’s discipline-specific knowledge and processes used to solve a problem (e.g., Prevost and Lemons, 2016 ). Domain-specific knowledge includes declarative (knowledge of content), procedural (how to utilize certain strategies), and conditional knowledge (when and why to utilize certain strategies) as they relate to a specific discipline ( Alexander and Judy, 1988 ; Schraw and Dennison, 1994 ; Prevost and Lemons, 2016 ).

Previous studies on problem solving within a discipline have emphasized the importance of domain-specific declarative and conditional knowledge, as students need to understand and be able to apply relevant content knowledge to successfully solve problems ( Alexander et al. , 1989 ; Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ). Our prior work ( Avena and Knight 2019 ) also supported this necessity. After students solved a genetics problem within a content area, they were offered a content hint on a subsequent content-matched question. We found that content hints improved performance overall for students who initially did not understand a concept. In characterizing the students’ responses, we found that the students who benefited from the hint typically used the content language of the hint in their solution. However, we also found that some students who continued to struggle included the content language of the hint but did not use the information in their problem solutions. For example, in solving problems on predicted recombination frequency for linked genes, an incorrect solution might use the correct terms of map units and/or recombination frequency but not actually use map units to solve the problem. Thus, these findings suggest that declarative knowledge is necessary but not sufficient for complex problem solving and also emphasize the importance of procedural knowledge, which includes the “logic” of generating a solution ( Avena and Knight, 2019 ). By definition, procedural knowledge uses both cognitive processes, such as providing reasoning for a claim or executing a task, and metacognitive processes, such as planning how to solve a problem and checking (i.e., evaluating) one’s work (e.g., Kuhn and Udell, 2003 ; Meijer et al. , 2006 ; Tanner, 2012 ). We explore these processes in more detail below.

Cognitive Processing: Reasoning

Generating reasoning requires using one’s knowledge to search for and explain an appropriate set of ideas to support or refute a given model ( Johnson-Laird, 2010 ), so reasoning is likely to be a critical component of solving problems. Toulmin’s original scheme for building a scientific argument ( Toulmin, 1958 ) included generating a claim, identifying supporting evidence, and then using reasoning (warrant) to connect the evidence to the claim. Several studies have demonstrated a positive relationship between general reasoning “ability” ( Lawson, 1978 ), defined as the ability to construct logical links between evidence and conclusions using conceptual principles, and performance ( Cavallo, 1996 ; Cavallo et al. , 2004 ; Johnson and Lawson, 1998 ). As elaborated in more recent literature, there are many specific subcategories of reasoning. Students commonly use memorized patterns or formulas to solve problems: this approach is considered algorithmic and could be used to provide logic for a problem ( Jonsson et al. , 2014 ; Nyachwaya et al. , 2014 ). Such algorithmic reasoning may be used with or without conveying an understanding of how an algorithm is used ( Frey et al. , 2020 ). When an algorithm is not appropriate (or not used) in describing one’s reasoning, but instead the solver provides a generalized explanation of underlying connections, this is sometimes referred to as “explanatory” or “causal” reasoning ( Russ et al. , 2008 ). Distinct from causal reasoning is the domain-specific form of mechanistic reasoning, in which a mechanism of action of a biological principle is elaborated ( Russ et al. , 2008 ; Southard et al. , 2016 ). Another common form of reasoning is quantitative reasoning, which can also be described as statistical or, in other specialized situations, graph-construction reasoning (e.g., Deane et al. , 2016 ; Angra and Gardner, 2018 ). The detailed studies of these specific subcategories of reasoning have usually involved extensive interviews with students and/or very specific guidelines that prompt the use of a particular type of reasoning. Those who have explored students’ unprompted general use of reasoning have found that few students naturally use reasoning to support their ideas ( Zohar and Nemet, 2002 ; James and Willoughby, 2011 ; Schen, 2012 ; Knight et al. , 2015 ; Paine and Knight, 2020 ). However, with explicit training to integrate their knowledge into mental models ( Kuhn and Udell, 2003 ; Osborne, 2010 ) or with repeated cueing from instructors ( Russ et al. , 2008 ; Knight et al. , 2015 ), students can learn to generate more frequent, specific, and robust reasoning.

Metacognitive Processing

Successfully generating possible solutions to problems likely also involves metacognitive thinking . Metacognition is often separated into two components: metacognitive knowledge (knowledge about one’s own understanding and learning) and metacognitive regulation (the ability to change one’s approach to learning; Flavell, 1979 ; Jacobs and Paris, 1987 ; Schraw and Moshman, 1995 ). Metacognitive regulation is usually defined as including such processes as planning, monitoring one’s progress, and evaluating or checking an answer ( Flavell, 1979 ; Jacobs and Paris, 1987 ; Schraw and Moshman, 1995 ; Tanner, 2012 ). Several studies have shown that helping students use metacognitive strategies can benefit learning. For example, encouraging the planning of a possible solution beforehand and checking one’s work afterward helps students generate correct answers during problem solving (e.g., Mevarech and Amrany, 2008 ; McDonnell and Mullally, 2016 ; Stanton et al. , 2015 ). However, especially compared with experts, students rarely use metacognitive processes, despite their value ( Smith and Good, 1984 ; Smith, 1988 ). Experts spend more time orienting, planning, and gathering information before solving a problem than do students, suggesting that experts can link processes that facilitate generating a solution with their underlying content knowledge ( Atman et al. , 2007 ; Peffer and Ramezani, 2019 ). Experts also check their problem-solving steps and solutions before committing to an answer, steps not always seen in student responses ( Smith and Good, 1984 ; Smith, 1988 ). Ultimately, prior work suggests that, even when students understand content and employ appropriate cognitive processes, they may still struggle to solve problems that require reflective and regulative skills.

Theoretical Framework: Approaches to Learning

Developing domain-specific conceptual knowledge requires integrating prior knowledge and new disciplinary knowledge ( Schraw and Dennison, 1994 ). In generating conceptual knowledge, students construct mental models in which they link concepts together to generate a deeper understanding ( Johnson-Laird, 2001 ). These mental constructions involve imagining possible relationships and generating deductions and can be externalized into drawn or written models for communicating ideas ( Chin and Brown, 2000 ; Bennett et al. , 2020 ). Mental models can also trigger students to explain their ideas to themselves (self-explanation), which can also help them solve problems ( Chi et al. , 1989 ).

As our goal is to make visible how students grapple with their knowledge during problem solving, we fit this study into the approaches to learning framework (AtL: Chin and Brown, 2000 ). This framework, derived from detailed interviews of middle-school students solving chemistry problems, defines five elements of how students approach learning and suggests that these components promote deeper learning. Three of these elements are identifiable in the current study: engaging in explanations (employing reasoning through understanding and describing relationships and mechanisms), using generative thinking (application of prior knowledge and analogical transfer), and engaging in metacognitive activity (monitoring progress and modifying approaches). The remaining two elements: question asking (focusing on facts or on understanding) and depth of approaching tasks (taking a deep or a surface approach to learning: Biggs, 1987 ) could not be addressed in our study. However, previous studies showed that students who engage in a deep approach to learning also relate new information to prior knowledge and engage in reasoning (explanations), generate theories for how things work (generative thinking), and reflect on their understanding (metacognitive activity). In contrast, those who engage in surface approaches focus more on memorized, isolated facts than on constructing mental or actual models, demonstrating an absence of the three elements described by this framework. Biggs (1987) also previously provided evidence that intrinsically motivated learners tended to use a deep approach, while those who were extrinsically motivated (e.g., by grades), tended to use a surface approach. Because solving complex problems is, at its core, about how students engage in the learning process, these AtL components helped us frame how students’ learning is revealed by their own descriptions of their thinking processes.

Characterizing Problem-Solving Processes

Thus far, a handful of studies have investigated the processes adult students use in solving biology problems, and how these processes might influence their ability to develop reasonable answers ( Smith and Good, 1984 ; Smith, 1988 ; Nehm, 2010 ; Nehm and Ridgway, 2011 ; Novick and Catley, 2013 ; Prevost and Lemons, 2016 ; Sung et al. , 2020 ). In one study, Prevost and Lemons (2016) collected and analyzed students’ written documentation of their problem-solving procedures when answering multiple-choice questions. Students were taught to document their step-by-step thinking as they answered multiple-choice exam questions that ranged from Bloom’s levels 2 to 4 (understand to analyze; Bloom et al. , 1956 ), describing the steps they took to answer each question. The authors’ qualitative analyses of students’ documented problem solving showed that students frequently used domain-general test-taking skills, such as comparing the language of different multiple-choice distractors. However, students who correctly answered questions tended to use more domain-specific procedures that required knowledge of the discipline, such as analyzing visual representations and making predictions, than unsuccessful students. When students solved problems that required the higher-order cognitive skills of application and analysis, they also used more of these specific procedures than when solving lower-level questions. Another recent study explored how students solved exam questions on the genetic topics of recombination and nondisjunction through in-depth clinical interviews ( Sung et al. , 2020 ). These authors described two approaches that are not conceptual: using algorithms to bypass conceptual thinking and using non–biology specific test-taking strategies (e.g., length of answer, specificity of terminology). They also showed that students sometimes alternate between using an algorithm and a conceptual strategy, defaulting to the algorithm when they do not understand the underlying biological concept.

Research Question 1. How do experts and students differ in their descriptions of problem-solving processes, examined here with a much larger sample size than in the previous literature (e.g., Chi et al., 1981; Smith and Good, 1984; Smith, 1988; Atman et al., 2007; Peffer and Ramezani, 2019)?

Research Question 2. Are certain problem-solving processes more likely to be used in correct than in incorrect student answers?

Research Question 3. Do problem-solving processes differ based on content and are certain combinations of problem-solving processes associated with correct student answers for each content area?

Mixed-Methods Approach

This study used a mixed-methods approach, combining both qualitative and quantitative research methods and analysis to understand a phenomenon more deeply ( Johnson et al. , 2007 ). Our goal was to make student thinking visible by collecting written documentation of student approaches to solving problems (qualitative data), in addition to capturing answer correctness (quantitative data), and integrating these together in our analyses. The student responses serve as a rich and detailed data set that can be interpreted using the qualitative process of assigning themes or codes to student writing ( Hammer and Berland, 2014 ). In a qualitative study, the results of the coding process are unpacked using examples and detailed descriptions to communicate the findings. In this study, we share such qualitative results but also convert the coded results into numerical representations to demonstrate patterns and trends captured in the data. This is particularly useful in a large-scale study, because the output can be analyzed statistically to allow comparisons between categories of student answers and different content areas.

Students in this study were enrolled in an introductory-level undergraduate genetics course for biology majors at the University of Colorado in Spring 2017 (n = 416). This course is the second in a two-course introductory series, with the first course being Introduction to Cell and Molecular Biology. The students were majority white; 60% were female, and 63% were in their first or second year. Ninety percent of the students were majoring in biology or a biology-related field (neuroscience, integrative physiology, biochemistry, biomedical engineering). Of the students enrolled in the course, 295 consented to be included in the study; some of their responses were previously described in Avena and Knight (2019). We recruited experts from the Society for the Advancement of Biology Education Research Listserv by inviting graduate students, postdoctoral fellows, and faculty to complete an anonymous online survey consisting of the same questions that students answered. Of the responses received, we analyzed responses from 52 experts. Due to the anonymous nature of the survey, we did not collect descriptive data about the experts.

Problem Solving

As part of normal course work, students were offered two practice assignments covering four content areas related to each of two course exams (also described in Avena and Knight, 2019). Students could answer up to nine questions in blocks of three questions each, in randomized order, for three of the four content areas. Expert participants answered a series of four questions, one in each of the four content areas. All questions were offered online using the survey platform Qualtrics. All participants were asked to document their problem-solving processes as they completed the questions (as in Prevost and Lemons, 2016), and they were provided with written instructions and an example in the online platform only (see Supplemental Material); no instructions were given in class, and no explicit discussion of the types of problem-solving processes to use was held in class during the semester. Students could receive extra credit up to ∼1% of the course point total, obtaining two-thirds credit for explaining their answer and an additional one-third if they answered correctly. All students who completed the assignment received credit regardless of their consent to participate in the research.

We used questions developed for a prior study ( Avena and Knight, 2019 ) on four challenging genetics topics: calculation of the probability of inheritance across multiple generations (Probability), prediction of the cause of an incorrect chromosome number after meiosis (Nondisjunction), interpretation of a gel and pedigree to determine inheritance patterns (Gel/Pedigree), and prediction of the probability of an offspring’s genotype using linked genes (Recombination; see example in Figure 1 ; all questions presented in Supplemental Material). These content areas have previously been shown to be challenging based on student performance ( Smith et al. , 2008 ; Smith and Knight, 2012 ; Avena and Knight, 2019 ). Each content area contained three isomorphic questions that addressed the same underlying concept, targeted higher-order cognitive processes ( Bloom et al. , 1956 ), and contained the same amount of information with a visual ( Avena and Knight, 2019 ). Each question had a single correct answer and was coded as correct (1) or incorrect (0). For each problem-solving assignment, we randomized 1) the order of the three questions within each content area for each student and 2) the order in which each content area was presented. During each set of three isomorphic questions, while solving one of the isomorphic problems, students also had the option to receive a “content hint,” a single most commonly misunderstood fact for each content area. We do not discuss the effects of the content hints in this paper (instead, see Avena and Knight, 2019 ).

FIGURE 1. Sample problem for students from the Gel/Pedigree content area. Problems in each content area contain a written prompt and an illustrated image, as shown in this example.

Process Coding

Students may engage in processes that they do not document in writing, but we are limited to analyzing only what they do provide in their written step-by-step descriptions. For simplicity, throughout this paper, a “process” is a thought documented by the participant that is coded as a particular process. When we refer to “failure” to use a process, we mean that a participant did not describe this thought process in the answer. Our initial analysis of student processes used a selection of codes from Prevost and Lemons (2016) and Toulmin’s ( 1958 ) original codes of Claim and Reason. We note that all the problems we used can potentially be solved using algorithms, memorized patterns previously discussed and practiced in the class, which may have limited the reasoning students supplied. Because of the complexity of identifying different types of reasoning, we did not further subcategorize the reasoning category in the scheme we present, as this is beyond the scope of this paper. We used an emergent coding process ( Saldana, 2015 ) to identify additional and different processes, including both cognitive and metacognitive actions. Thus, our problem-solving processes (PsP) coding scheme captures the thinking that students document while solving genetics problems (see individual process codes in Table 1 ). We used HyperRESEARCH software (ResearchWare, Inc.) to code each individual’s documented step-by-step processes. A step was typically a sentence and sometimes contained multiple ideas. Each step was given one or more codes, with the exception of reasoning supporting a final conclusion (see Table 2 for examples of coded responses). Each individual process code captures when the student describes that process, regardless of whether the statement is correct or incorrect. Four raters (J.K.K., J.S.A., O.N.W., B.B.M.) coded a total of 24 student answers over three rounds of coding and discussion to reach consensus and identify a final coding scheme. Following agreement on the codes, an additional 12 answers were coded by the four raters to determine interrater agreement. Specifically, in these 12 answers, there were 150 instances in which a code for a step was provided by one or more raters. For each of these 150 instances, we identified the number of raters who agreed. We then calculated a final interrater agreement of 83% by dividing the total number of raters who agreed for all 150 instances (i.e., 524) by the total number of possible raters to agree for four raters in 150 instances (i.e., 600). We excluded answers in which students did not describe their problem-solving steps and those in which students primarily or exclusively used domain-general processes (i.e., individual process codes within the General strategy category in Table 1 ) or made claims without any other supporting codes. The latter two exclusion criteria were used because such responses lacked sufficient description to identify the thought processes. The final data set included a total of 1853 answers from 295 students and 149 answers from 52 experts. We used only correct answers from experts to serve as a comparison to student answers, excluding an additional 29 expert answers that were incorrect.

Table 1 note: Examples of student responses are to a variety of content areas and have been edited for clarity. Each individual process code captures the student's description, regardless of whether the statement is correct or incorrect.

Table 2 note: The responses above are all solutions to the question in Figure 1.

After initial coding and analyses, we identified that student use of drawing was differentially associated with correctness based on content area. Thus, to further characterize drawing use, two raters (J.S.A. and J.K.K.) explored incorrect student answers from Probability and Recombination. One rater examined 33 student answers to identify an initial characterization, and then two raters reviewed a subset of answers to agree upon a final scheme. Each rater then individually categorized a portion of the student answers, and the final interrater agreement on 10 student answers was 90%. Interrater agreement was calculated as described earlier, with each answer serving as one instance, so we divided the total number of raters agreeing for each answer (i.e., 18) by the total possible number of raters agreeing (i.e., 20).
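The agreement arithmetic described in this paragraph reduces to a one-line calculation; the R sketch below simply restates the numbers given above.

# 10 answers categorized by 2 raters; raters agreed on 18 of the 20 possible rater-judgments.
agreeing <- 18
possible <- 2 * 10
agreeing / possible   # 0.90, i.e., 90% agreement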

Statistical Analyses

The unit of analysis for all models considered is an individual answer to a problem. We investigate three variations of linear models, specified below. The response variable in all cases is binary (presence/absence of process or correct/incorrect answer). Thus, the models are generalized linear models, and, more specifically, logistic regression models. Because our data contain repeated measures in the form of multiple answers per student, we specifically use generalized linear mixed models (GLMM) to include a random effect on the intercept term in all models, grouped by participant identifier ( Gelman and Hill, 2006 ; Theobald, 2018 ). This component of the model accounts for variability in the baseline outcome between participants. In our case, we can model each student’s baseline probability of answering a problem correctly or each participant’s baseline probability of using a given process (e.g., one student may use Reason more frequently than another student). Accounting for this variation yields better estimates of the fixed effects in the models.
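A minimal R sketch of the kind of random-intercept logistic model described here, written with lme4; the simulated data frame and variable names are placeholders rather than the study's data, and the formula mirrors model 1 below.

library(lme4)

set.seed(1)
dat <- data.frame(
  id      = factor(rep(1:60, each = 4)),   # participant identifier (grouping variable)
  group   = factor(sample(c("expert", "correct student", "incorrect student"),
                          240, replace = TRUE)),
  process = rbinom(240, 1, 0.6)            # 1 = the process appears in the answer
)

# Logistic GLMM with a random effect on the intercept, grouped by participant:
m1 <- glmer(process ~ group + (1 | id), data = dat, family = binomial)
summary(m1)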


The fitted models give some, but not all, pairwise comparisons among predictor groups. We conducted pairwise post hoc comparisons (e.g., expert vs. correct student, expert vs. incorrect student, correct student vs. incorrect student, or among the four content areas) to draw inferences about the differences among all groups. In particular, we performed Tukey pairwise honestly significant difference (HSD) tests for all pairs of groups, comparing estimated marginal means (estimated using the fitted model) on the logit scale. Using estimated marginal means corrects for unbalanced group sample sizes, and using the Tukey HSD test provides adjusted p values, facilitating comparison to a significance level of α = 0.05.
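Continuing the sketch above (and assuming the fitted model m1 from it), the post hoc comparisons described here could be obtained with the emmeans package; the group names and data remain hypothetical.

library(emmeans)

# Estimated marginal means on the logit scale, then Tukey-adjusted pairwise contrasts:
emm <- emmeans(m1, ~ group)
pairs(emm, adjust = "tukey")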

To ease reproducibility, we use “formula” notation conventionally used in R to specify the models we employ in this paper, which has the following general form: outcome = fixed effect + (1 | group). The random effects component is specified within parentheses, with the random effect on the left of the vertical bar and the grouping variable on the right.

Model 1: Process present = Expert/Student answer status + (1 | ID)

where "Process present" is the response variable: process absent (0)/present (1) in a given answer; "Expert/Student answer status" is the fixed effect: factor-level grouping: correct expert/correct student/incorrect student; and "(1 | ID)" is the random effect on the intercept, grouped by participant identifier.

Model 2: Process present = Content area + (1 | ID)

where “Process present” is the response variable as described for model 1; “Content area” is the fixed effect: Factor-level grouping: Probability (1)/Nondisjunction (2)/Gel-Pedigree (3)/Recombination (4); and “(1|ID)” is the random effect as described for model 1.

Model 3: Student answer correctness = Process 1 + Process 2 + … + Process X + (1 | ID)

where “Student answer correctness” is the response variable: incorrect (0)/correct (1); “Process 1 + Process 2 + … + Process X” is the list of process factors entered into the model as the fixed effect: absent (0)/present (1); and “(1|ID)” is the random effect as described for models 1 and 2. We identified which components were associated with correctness by seeing which predictor coefficients remained non-zero in a representative lasso model. We identified a representative model for each content area by first identifying the lasso penalty with the lowest Akaike information criterion (AIC) to reduce variance and then identifying a lasso penalty with a similar AIC that could be used across all content areas. Because a penalty parameter of 25 and the penalty parameter with the lowest AIC for each content area had similar AIC values, we consistently used a penalty parameter of 25. Note that when the penalty parameter is set to zero, the GLMM model is recovered. On the other hand, when the penalty parameter is very large, no predictors are included in the model. Thus, the selected penalty parameter forced many, but not all, coefficients to 0, giving a single representative model for each content area.
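A rough sketch of how such a penalized GLMM could be fit with the glmmLasso package named below; the simulated data, the three process predictors, and the outcome are placeholders, with the penalty set to 25 as in the text.

library(glmmLasso)

set.seed(2)
dat <- data.frame(
  ID        = factor(rep(1:60, each = 4)),   # participant identifier
  correct   = rbinom(240, 1, 0.5),           # answer correctness (0/1)
  Reason    = rbinom(240, 1, 0.7),           # process present (0/1)
  Calculate = rbinom(240, 1, 0.4),
  Draw      = rbinom(240, 1, 0.3)
)

# Lasso-penalized logistic GLMM with a random intercept per participant:
m3 <- glmmLasso(correct ~ Reason + Calculate + Draw,
                rnd    = list(ID = ~1),
                data   = dat,
                lambda = 25,
                family = binomial(link = "logit"))
summary(m3)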

All models and tests were performed in R (v. 3.5.1). We used the lme4 package in R ( Bates et al. , 2015 ) for models 1 and 2, and estimation of parameters was performed using residual maximum likelihood. For model 3, we used the glmmLasso package, and the model was fit using the default EM-type estimate. Post hoc pairwise comparisons were performed using the emmeans package.

Human Subjects Approval

Human research was approved by the University of Colorado Institutional Review Board (protocols 16-0511 and 15-0380).

The PsP Coding Scheme Helps Describe Written Cognitive and Metacognitive Processes

We developed a detailed set of codes, which we call the PsP scheme to characterize how individuals describe their solutions to complex genetics problems. Table 1 shows the 18 unique processes along with descriptions and examples for each. With the support of previous literature, we grouped the individual processes into seven strategies, also shown in Table 1 . All strategies characterized in this study were domain specific except the General category, which is domain general. We categorized a set of processes as Orientation based on a previously published taxonomy for think-aloud interviews ( Meijer et al. , 2006 ) and on information management processes from the Metacognitive Awareness Inventory ( Schraw and Dennison, 1994 ). Orienting processes include: Notice (identifying important information in the problem), Recall (activating prior knowledge without applying it), Identify Similarity (among question types), and Identify Concept (the “type” of problem). Orientation processes are relatively surface level, in that information is observed and noted, but not acted on. The Metacognition category includes the three common elements of planning (Plan), monitoring (Assess Difficulty), and evaluating (Check) cited in the metacognitive literature (e.g., Schraw and Moshman, 1995 ; Tanner, 2012 ). The Execution strategy includes actions taken to explicitly solve the problem, including Use Information (apply information related to the problem), Integrate (i.e., linking together two visual representations provided to solve the problem or linking a student’s own drawing to information in the problem), Draw, and Calculate. The Use Information category is distinguished from Recall by a student applying a piece of information (Use Information) rather than just remembering a fact without directly using it in the problem solution (Recall). Students may Recall and then Use Information, just Recall, or just Use Information. If a student used the Integrate process, Use Information was not also coded (i.e., Integrate supersedes Use Information). The Reasoning strategy includes just one general process of Reason, which we define as providing an explanation or rationale for a claim, as previously described in Knight et al. (2013) , Lawson (2010) , and Toulmin (1958) . The Conclusion strategy includes Eliminate and Claim, processes that provide types of responses to address the final answer. The single process within the Error strategy category, Misinterpret, characterizes steps in which students misunderstand the question stem. Finally, the General category includes the codes Clarify, State the Process, and Restate, all of which are generic statements of execution, representing processes that are domain general ( Alexander and Judy, 1988 ; Prevost and Lemons, 2016 ).

To help visualize the series of steps students took and how these steps differed across answers and content areas, we provide detailed examples in Tables 2 and 3 . In Table 2 , we provide three examples of similar-length documented processes to the same Gel/Pedigree problem ( Figure 1 ) from a correct expert, a correct student, and an incorrect student. Note the multiple uses of planning and reasoning in the expert answer, multiple uses of reasoning in the correct student answer, and the absence of both such processes in the incorrect student answer. The reasoning used in each case provides a logical explanation for the claim, which either immediately precedes or follows the reasoning statement. For example, in the second incident of Claim and Reason for Eliot, “because otherwise Zach could not be unaffected” is a logical explanation for the claim “it has to be dominant.” Similarly, for Cassie’s Claim and Reason code, “If both parents are heterozygous for the disease” is a logical explanation for the claim “it is probably inherited in a dominant manner.” Table 3 provides additional examples of correct student answers to the remaining three content areas. Note that for Probability and Recombination questions, the Reason process often explains why a certain genotype or probability is assigned (e.g., “otherwise all or none of the children would have the disease” explains why “Both parents of H and J must be Dd” in Li’s Probability answer) or how a probability is calculated, for example, “using the multiplication rule” (Li’s Probability explanation) or “multiply that by the 100% chance of getting ‘af’ from parent 2” (Preston’s Recombination explanation). In Nondisjunction problems, a student may claim that a nondisjunction occurred in a certain stage of meiosis (the Claim) because it produces certain gamete genotypes consistent with such an error (the Reason), as seen in Gabrielle’s answer.

Table 3 note: Responses edited slightly for clarity. See Table 2 for a correct student documented solution to the Gel/Pedigree problem.

Across All Content Areas, Expert Answers Are More Likely Than Student Answers to Contain Orientation, Metacognition, and Execution Processes

For each category of answers (expert, correct student, and incorrect student), we calculated the overall percent of answers that contained each process and compared these frequencies. Note that, in all cases, frequency represents the presence of a process in an answer, not a count of all uses of that process in an answer. The raw frequency of each process is provided in Table 4 , columns 2–4. To determine statistical significance, we used GLMM to account for individual variability in process use. The predicted likelihood of each process per group and pairwise comparisons between groups from this analysis is provided in Table 4 , columns 5–10. These comparisons show that expert answers were significantly more likely than student answers to contain the processes of Identify concept, Recall, Plan, Check, and Use Information ( Table 4 and Supplemental Table S1). The answers in Table 2 represent some of the typical trends identified for each group. For example, expert Eliot uses both Plan and Check, but these metacognitive processes are not used by either student, Cassie (correct answer) or Ian (incorrect answer).

Table 4 note: Pairwise comparison: incorrect students to correct students (i–c), incorrect students to correct experts (i–e), correct students to correct experts (c–e). NA, no comparison made due to predicted probability of 0 in at least one group. *** p < 0.001; ** p < 0.01; * p < 0.05; ns: p > 0.05. See Supplemental Table S1 for standard error of coefficient estimates. Interpretation example: 82.05% and 92.36% of incorrect and correct student answers, respectively, contained Reason. The GLMM, after accounting for individual variability, predicts the probability of an incorrect student using Reason to be 91.80%, while the probability of a correct student using Reason is 96.68%.

Across All Content Areas, Correct Student Answers Are More Likely Than Incorrect Answers to Contain the Processes of Reason and Eliminate

Students most commonly used the processes Use Information, Reason, and Claim, each present in at least 50% of both correct and incorrect student answers ( Table 4 ). The processes Notice, Recall, Calculate, and Clarify were present in 20–50% of both correct and incorrect student answers ( Table 4 ). In comparing correct and incorrect student answers across all content areas, we found that Integrate, Reason, Eliminate, and Clarify were more likely used in correct compared with incorrect answers ( Table 4 ). As illustrated in Table 2 , the problem-solving processes in Cassie’s correct answer include: reasoning for a claim of dominant inheritance and eliminating when ruling out the possibility of an X-linked mode of inheritance. However, in describing the incorrect answer, Ian fails to document use of either of these processes.

Process Use Varies by Question Content

To determine whether student answers contain different processes depending on the content of the problem, we separated answers by content area, regardless of correctness. We excluded several processes from this analysis: the Error and General codes, as well as Claim, which was present in virtually every answer across content areas. We also excluded the rarely used processes of Identify Similarity and Identify Concept, which were present in 5% or fewer of both incorrect and correct student answers. For the remaining 11 processes, we found that each content area elicited different frequencies of use, as shown in Table 5 and Supplemental Table S2. Some processes were nearly absent in a content area: Calculate was rarely seen in answers to Nondisjunction and Gel/Pedigree questions, and Eliminate was rarely seen in answers to Probability and Recombination questions. Furthermore, in answering Probability questions, students were more likely to use the processes Plan and Use Information than in any other content area. Recall was most likely in answers to Recombination questions and least likely in answers to Gel/Pedigree questions. Examples of student answers showing some of these trends are given in Table 3.

aAll student answers (correct and incorrect) are reported. Processes excluded from analyses include Claim, those within the Error and General strategies, and processes that were present in 5% or fewer of both incorrect and correct student answers. Pairwise comparisons between: Probability (P), Recombination (R), Nondisjunction (N), and Gel/Pedigree (G). NA: no comparison made due to prevalence of 0% in at least one group. *** p < 0.001; ** p < 0.01; * p < 0.05; ns: p > 0.05. See Supplemental Table S2 for standard errors of coefficient estimates. Interpretation example: In Probability questions, 94.43% of answers contain Reason, while in Nondisjunction, 84.69% of answers contain Reason. Based on GLMM estimates that account for individual variability in process use, an answer to a Probability question had a 97.52% chance of containing Reason, and an answer to a Nondisjunction question had a 92.88% chance of containing this process.

The Combination of Processes Linked to Correctness Differs by Content Area

Performance varied by content area. Students performed best on Nondisjunction problems (75% correct), followed by Gel/Pedigree (73%), Probability (54%), and then Recombination (45%). Table 6 shows the raw data of process prevalence for correct and incorrect student answers in each of the four content areas.

To examine the combination of problem-solving processes associated with correct student answers for each content area, we used a representative GLMM with a lasso penalty. This type of analysis measures the predictive value of a process on answer correctness, returning a coefficient value. The presence of a factor with a higher positive coefficient increases the probability of answering correctly more than a factor with a lower positive coefficient. With each additional positive factor in the model, the likelihood of answering correctly increases in an additive manner (Table 7 and Supplemental Table S3). To interpret these values, we show the probability estimates (%) for each process, which represent the probability that an answer will be correct in the presence of one or more processes (Table 7). The strength of association of each process with correctness, measured by positive coefficient size, is listed in descending order. Thus, for each content area, the process with the strongest positive association with a correct answer is listed first. A process with a negative coefficient (a negative association with correctness) is listed last, and models with negative associations are highlighted in gray in Table 7.

An example of how to interpret the GLMM is as follows. For the content area of Probability, Calculate (strongest association with correctness), Use Information, and Reason (weakest association with correctness) in combination are positively associated with correctness; Draw is the only negative predictor of correctness. For this content area, the intercept indicates a 7.31% likelihood of answering correctly in the absence of any of the processes tested. If an answer contains Calculate only, there is a 40.19% chance the answer will be correct. If an answer contains both Calculate and Use Information, there is a 58.60% chance the answer will be correct, and if the answer contains the three processes of Calculate, Use Information, and Reason combined, there is a 67.56% chance the answer will be correct. If Draw is present in addition to these three processes, the chance the answer will be correct decreases slightly to 66.40%.

For Recombination, the processes of Calculate, Recall, Use Information, Reason, and Plan in combination are associated with correctness, while Draw and Assess Difficulty are negatively associated with correctness. For Nondisjunction, the processes of Eliminate, Draw, and Reason in combination are associated with correctness. For Gel/Pedigree, only the process of Reason was associated with correctness. The examples of correct student answers for each content area, shown in Tables 2 and 3, were selected to include each of the positively associated processes described.

aAll student answers (correct and incorrect) are reported. Processes excluded from analyses include Claim, those within the Error and General strategies, and processes that were present in 5% or fewer of both correct and incorrect student answers.

aBased on a representative GLMM with a lasso penalty predicting answer correctness, using a moderate penalty parameter (lambda = 25). The intercept represents the likelihood of a correct answer in the absence of all processes initially entered into the model: Notice, Plan, Recall, Check, Assess Difficulty, Use Information, Integrate, Draw, Calculate, Reason, and Eliminate. Shaded rows indicate the inclusion of negative predictors in combination with positive predictors. Probabilities were calculated by taking the inverse logit of the sum of the relevant log-odds coefficient estimates and the intercept from Supplemental Table S3.
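
The inverse-logit arithmetic described in this note can be checked with a few lines of code. The snippet below reproduces the Probability-model percentages quoted above (7.31%, 40.19%, 58.60%, 67.56%); the coefficient values are rough back-calculations made only for illustration, not the estimates reported in Supplemental Table S3.

```python
# Inverse-logit arithmetic behind the reported probabilities (illustrative
# only; coefficients are back-calculated from the percentages in the text,
# not taken from Supplemental Table S3).
import math

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

intercept   = math.log(0.0731 / (1 - 0.0731))   # ~ -2.54: no processes present
b_calculate = 2.14                              # approximate log-odds contributions
b_use_info  = 0.75
b_reason    = 0.39

print(round(inv_logit(intercept), 3))                                       # ~0.073
print(round(inv_logit(intercept + b_calculate), 3))                         # ~0.40
print(round(inv_logit(intercept + b_calculate + b_use_info), 3))            # ~0.59
print(round(inv_logit(intercept + b_calculate + b_use_info + b_reason), 3)) # ~0.68
```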

To identify why drawing may be detrimental for Probability and Recombination problems, we further characterized how students described their process of Draw in incorrect answers from these two content areas. We identified two categories: Inaccurate drawing and Inappropriate drawing application. Table 8 provides descriptions and student examples for each category. For Probability problems, 49% of the incorrect student answers that used Draw were Inaccurate, as they identified incorrect genotypes or probabilities while drawing a Punnett square. Thirty-one percent of the answers contained Inappropriate drawing applications such as drawing a Punnett square for each generation of a multiple-generation pedigree rather than multiplying probabilities. Five percent of the answers displayed both Inaccurate and Inappropriate drawing ( Figure 2 ). For Recombination, 83% of incorrect student answers using Draw used an Inappropriate drawing application, typically treating linked genes as if they were unlinked by drawing a Punnett square to calculate probability. Ten percent of answers used both Inappropriate and Inaccurate drawing ( Figure 2 ).

FIGURE 2. Drawing is commonly inaccurate or inappropriate in incorrect student answers for Probability and Recombination. Drawing categorization for student answers that used Draw and were incorrect, for the content areas of (A) Probability ( n = 55) and (B) Recombination ( n = 71). Categories are mutually exclusive, so answers with both Inaccurate and Inappropriate drawing are not counted in the individual categories. “No drawing error” indicates that neither inaccurate nor inappropriate drawings were described. “Cannot determine” indicates that not enough information was provided in the student’s written answer to assign a drawing use category.

DISCUSSION

In this study, we identified and characterized the various processes that a large sample of students and experts used to document their answers to complex genetics problems. Overall, although their frequency of use differed, experts and students used the same set of problem-solving strategies. Experts were more likely to use orienting and metacognitive strategies than students, confirming prior findings on expert–novice differences (e.g., Chi et al., 1981; Smith and Good, 1984; Smith, 1988; Atman et al., 2007; Smith et al., 2013; McDonnell and Mullally, 2016; Peffer and Ramezani, 2019). For students, we also identified which strategies were most associated with correct answers. The use of reasoning was consistently associated with correct answers across all content areas combined as well as for each individual content area. Students used other processes more or less frequently depending on the content of the question, and the combination of processes associated with correct answers also varied by content area.

Domain-Specific Problem Solving

We found that most processes students used (i.e., all but those in the General category) were domain specific, relating directly to genetics content. Prevost and Lemons (2016) , who examined students’ process of solving multiple-choice biology problems, found that domain-general processes were more common in answers to lower-order than higher-order questions. They also found that using more domain-specific processes was associated with correctness. In our study, students solved only higher-order problems that asked them to apply or analyze information. Students also had to construct their responses to each problem, rather than selecting from multiple predetermined answer options. These two factors may explain the prevalence of domain-specific processes in the current study, which allowed us to investigate further the types of domain-specific processes that lead to correct answers.

Metacognitive Activity: Orienting and Metacognitive Processes Are Described by Experts but Not Consistently by Students

Our results support several previous findings from the literature comparing the problem-solving tactics of experts and students: experts are more likely to describe orienting and metacognitive problem-solving strategies than students, including planning solutions, checking work, and identifying the concept of the problem.

While some students used planning in their correct answers, experts solving the same problems were more likely to do so. Prior studies of solutions to complex problems in both engineering and science contexts found that experts more often used the orienting/planning behavior of gathering appropriate information compared with novices (Atman et al., 2007; Peffer and Ramezani, 2019). Experts have likely engaged in authentic scientific investigations of their own, and planning is more common when the problem to be solved is more complex (e.g., Atman et al., 2007), so experts are presumably more familiar with planning ahead, and see more value in it, before pursuing a particular problem-solving approach.

Experts were much more likely than students to describe checking their work, as also shown in previous work (Smith and Good, 1984; Smith, 1988; McDonnell and Mullally, 2016). McDonnell and Mullally (2016) found greater levels of unprompted checking after students experienced modeling of explicit checking prompts and were given points for demonstrating checking. These researchers also noted that when students reviewed their work, they usually checked only some problem components, not all. Incomplete checking was associated with incorrect answers, while complete checking was associated with correct answers. In the current study, we did not assess the completeness of checking, and therefore may have missed an opportunity to correlate checking with correctness. However, if most students were checking their answers only in a superficial way (i.e., checking one step in the problem-solving process rather than all steps), this could explain why there were no differences in the presence of checking between incorrect and correct student answers. In contrast to our study, Prevost and Lemons (2016) found checking was the most common domain-specific procedure used by students when answering both lower- and higher-order multiple-choice biology questions. The multiple-choice format may prompt checking, as the answer options have already been provided in the scenario. In addition, while that study assessed answers to graded exam questions, we examined answers to extra-credit assignments. Thus, a lack of motivation may have influenced whether the students in the current study reported checking their answers.

Identifying the Concept of a Problem.

Although this strategy was relatively uncommon even among experts, they were more likely than students to describe identifying the concept of a problem in their solutions. This is consistent with previous research showing that nonexperts use superficial features to solve problems ( Chi et al. , 1981 ; Smith and Good, 1984 ; Smith et al. , 2013 ), a tactic also associated with incorrect solutions ( Smith and Good, 1984 ). The process of identifying relevant core concepts in a problem allows experts to identify the appropriate strategies and knowledge needed for any given problem ( Chi et al. , 1981 ). Thus, we suggest that providing students with opportunities to recognize the core concepts of different problems, and thus the similarity of their solutions, could be beneficial for learning successful problem solving.

Engaging in Explanations: Using Reasoning Is Consistently Associated with Correct Answers

Our findings suggest that, although reasoning is frequently used in both correct and incorrect answers, it is strongly associated with correct student answers across all content areas. Correct answers were more likely than incorrect answers to use reasoning; furthermore, reasoning was associated with a correct answer for each of the four content areas we explored. This supports previous work showing that reasoning ability in general is associated with overall biology performance (Cavallo, 1996; Johnson and Lawson, 1998). Students who use reasoning may be demonstrating their ability to think logically and sequentially connect ideas, essentially building an argument for why their answers make sense. In fact, teaching the skill of argumentation helps students learn to use evidence to provide a reason for a claim, as well as to rebut others’ claims (Toulmin, 1958; Osborne, 2010), and can improve their performance on genetics concepts (Zohar and Nemet, 2002). Thus, the genetics students in the current study who were able to explain the rationale behind each of their problem-solving steps are likely to have built a conceptual understanding of the topic that allowed them to construct logical rationales for their answers.

In the future, think-aloud interviews should be used to more closely examine the types of reasoning students use. Students may be more motivated, and better able, to explain their rationales verbally, or with a combination of drawn and verbal descriptions, than when typing their answers in a writing-only situation. Interviewers can also ask follow-up questions to confirm student explanations and ideas, something that cannot be obtained from written explanations. In addition, the problems used in this study were near-transfer problems, similar to those that students had previously solved during class. Such problems can often be solved using an algorithmic approach, as also recently described by Frey et al. (2020) in chemistry. Future studies could identify whether and when students use more complex approaches such as causal reasoning (providing connections between ideas) or mechanistic reasoning (explaining the biological mechanism as part of making causal connections; Russ et al., 2008; Southard et al., 2016) in addition to or instead of algorithmic reasoning.

Students Use Different Processes to Answer Questions in Different Content Areas

Overall, students answered 60% of the questions correctly. Some content areas were more challenging than others: Recombination was the most difficult, followed by Probability, then Gel/Pedigree and Nondisjunction (see also Avena and Knight, 2019). While our results do not indicate that a certain combination of processes is both necessary and sufficient to solve a problem correctly, they can be useful to instructors wishing to guide students’ strategy use for certain types of problems. In the following sections, we discuss the processes that were specifically associated with correctness in student answers for each content area.

Probability.

Solving a Probability question requires calculation, while many other types of problems do not. To solve the questions in this study, students needed to consider multiple generations from two families to calculate the likelihood of independent events occurring by using the product rule. Smith (1988) found that both successful and unsuccessful students often find this challenging. Our previous work also found that failing to use the product rule, or using it incorrectly, was the second most common error in incorrect student answers ( Avena and Knight, 2019 ). Correctly solving probability problems likely also requires a conceptual understanding of the reasoning behind each calculation (e.g., Deane et al. , 2016 ). This type of reasoning, specific to the mathematical components of a problem, is referred to as statistical reasoning, a suggested competency for biology students ( American Association for the Advancement of Science, 2011 ). The code of Reason includes reasoning about other aspects of the problem (e.g., determining genotypes; see Table 3 ) in addition to reasoning related to calculations. While reasoning was prevalent in both incorrect and correct answers to Probability problems, using reasoning still provided an additional 9% likelihood of answering correctly for students who had also used calculating and applying information in their answers.

Generally, calculation alone was not sufficient to answer a Probability question correctly: an answer containing only Calculate had a 40% chance of being correct. When students also applied information to the specific problem (captured with the Use Information code), such as determining genotypes within the pedigree or assigning a probability, the likelihood of a correct answer increased to 59% (see Table 7). We previously found that the most common content error in these types of probability problems was mis-assigning a genotype or probability due to incorrectly using information in the pedigree; this error was commonly seen in combination with a product rule error (Avena and Knight, 2019). This is consistent with our current findings on the importance of applying procedural knowledge: both Use Information and Calculate, under the AtL element of generating knowledge, contribute to correct problem solving.
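
For readers who want a concrete instance of the product rule these problems rely on, the sketch below works through a generic two-sided carrier calculation. It is an illustration only, not one of the study's pedigrees.

```python
# Generic product-rule calculation (illustration only): each unaffected parent
# is the child of two carriers (Aa x Aa) of an autosomal recessive condition.
# What is the chance that a child of these two parents is affected?
p_parent_carrier = 2 / 3   # an unaffected child of Aa x Aa is Aa with probability 2/3
p_transmit_a     = 1 / 2   # a carrier passes the recessive allele half the time

# The child is affected only if, on each side, the parent is a carrier AND
# transmits the recessive allele; the two sides are independent events.
p_child_affected = (p_parent_carrier * p_transmit_a) * (p_parent_carrier * p_transmit_a)
print(p_child_affected)    # 1/9 ~ 0.111
```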

Recombination.

Both the Probability and Recombination questions are fundamentally about calculating probabilities; thus, not surprisingly, Calculate is also associated with correct answers to Recombination questions. Determining map units and determining the frequency of one possible genotype among possible gametes both require calculation. Use of Recall in addition to Calculate increases the likelihood of answering correctly from 18% to 39%. This may be due to the complexity of some of the terms in these problems. As shown previously, incorrect answers to Recombination questions often fail to use map units in their solutions (Avena and Knight, 2019). Appropriately using map units thus likely requires remembering that the map unit designation represents the probability of recombination and then applying this definition to the problem. When students used Use Information along with Calculate and Recall, their likelihood of answering correctly increased to 63%.
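
The sketch below illustrates the map-unit arithmetic described here with a generic pair of linked genes; the 20-map-unit distance and the testcross are invented for illustration and are not taken from the study's problems.

```python
# How map units translate into gamete and offspring frequencies (generic
# example). A dihybrid parent carries the linked arrangement A b / a B,
# with the two genes 20 map units apart.
map_units = 20
recomb_freq = map_units / 100            # 20 m.u. => 20% recombinant gametes in total

p_parental_each    = (1 - recomb_freq) / 2   # A b and a B gametes: 40% each
p_recombinant_each = recomb_freq / 2         # A B and a b gametes: 10% each

# Probability of an a b / a b offspring from a testcross to an a b / a b parent:
# the tester contributes an 'a b' gamete with probability 1.
p_offspring = p_recombinant_each * 1.0
print(p_parental_each, p_recombinant_each, p_offspring)   # 0.4 0.1 0.1
```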

Reasoning and planning also contribute to correct answers in this content area. In their solutions, students needed to consider the genotypes of the offspring and both parents to solve the problem. The multistep nature of the problem may give students the opportunity to plan their possible approaches, either at the very beginning of the problem and/or as they walk through these steps. This was seen in Preston’s solution ( Table 3 ), in which the student sequentially made a short-term plan and then immediately used information in the problem to carry out that plan.

Drawing: A Potentially Misused Strategy in Probability and Recombination Solutions.

Only one process, Drawing, was negatively associated with correct answers in solutions to both Probability and Recombination questions. Drawing is generally considered beneficial in problem solving across disciplines, as it allows students to generate a representation of the problem space and/or of their thinking (e.g., Mason and Singh, 2010 ; Quillin and Thomas, 2015 ; Heideman et al. , 2017 ). However, when students generate inaccurate drawings or use a drawing methodology inappropriately, they are unlikely to reach a correct answer. In a study examining complex meiosis questions, Kindfield (1993) found that students with more incorrect responses provided drawings with features not necessary to solving the problem. In our current study, we found that the helpfulness of a drawing depends on its quality and the appropriateness or context of its use.

When answering Recombination problems, many students described drawing a Punnett square and then calculating the inheritance as if the linked genes were actually genes on separate chromosomes. In doing so, students revealed a misunderstanding of when and why to appropriately use a Punnett square as well as their lack of understanding that the frequency of recombination is connected to the frequency of gametes. Because we have also shown that planning is beneficial in solving Recombination problems, we suggest that instructors emphasize that students first plan to look for certain characteristics in a problem, such as linked versus unlinked genes, to identify how to proceed. For example, noting that genes are linked would suggest not using a Punnett square when solving the problem. Similarly, in Probability questions, students must realize that uncertainty in genotypes over multiple generations of a family can be resolved by multiplying probabilities together rather than by making multiple possible Punnett squares for the outcome of a single individual. These findings connect to the AtL elements of generative thinking and taking a deep approach: drawing can be a generative behavior, but students must also be thinking about the underlying context of the problem rather than a memorized fact.

Nondisjunction.

In Nondisjunction problems, students were asked to predict the cause of an error in chromosome number. Our model for processes associated with correctness in Nondisjunction problems (Table 7) suggested that the likelihood of answering correctly in the absence of any of the modeled processes was 70%. This may explain the higher percentage of correct answers in this content area (75%) compared with other content areas. Nonetheless, three processes helped students answer correctly. The process Eliminate, even though used relatively infrequently (10%), provides a benefit. Using elimination when there are a finite number of obvious solutions is a reasonable strategy, and one previously shown to be successful (Smith and Good, 1984). Ideally, this strategy would be coupled with drawing the steps of meiosis and then reasoning about which separation errors could not explain the answer. Drawing was associated with correct answers in this content area, though it was neither required nor sufficient. Instead of drawing, some students may have used a memorized series of steps in their solutions. This is referred to as an “algorithmic” explanation, in which a memorized pattern is used to solve the problem. For example, such a line of explanation may go as follows: “beginning from a diploid cell heterozygous for a certain gene, two of the same alleles being present in one gamete indicates a nondisjunction in meiosis II.” Such algorithms can be applied without a conceptual understanding (Jonsson et al., 2014; Nyachwaya et al., 2014), and thus students may apply them inaccurately, without fully understanding or being able to visualize what occurs during a nondisjunction event (Smith and Good, 1984; Nyachwaya et al., 2014). Using a drawing may help provide a basis for analytic reasoning, supplying logical links between ideas and claims that are thoughtful and deliberate (Alter et al., 2007). Indeed, Kindfield (1993), whose participants (experts and students) were asked to complete complex meiosis questions, found that those with more accurate models of meiosis used their drawings to assist their reasoning process. Kindfield also suggested that these drawings allowed for additional working memory space, thus supporting an accurate problem-solving process.
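
The gamete logic behind that algorithmic explanation can be laid out compactly, as in the sketch below for a heterozygous Aa cell; this is an illustration of the standard expectation (assuming no crossover between the gene and the centromere), not an excerpt from any student answer.

```python
# Expected gamete genotypes from a diploid Aa cell (illustration only;
# assumes no crossover between the gene and the centromere).
# "-" marks a gamete missing the chromosome entirely.
gametes = {
    "normal meiosis":            ["A", "A", "a", "a"],
    "nondisjunction meiosis I":  ["Aa", "Aa", "-", "-"],   # homologs fail to separate
    "nondisjunction meiosis II": ["AA", "-", "a", "a"],    # sister chromatids fail to
                                                           # separate in one MII division
}

for event, g in gametes.items():
    print(f"{event:27s} -> {g}")
# A gamete with two copies of the SAME allele (AA or aa) points to a meiosis II
# error, while a gamete carrying BOTH alleles (Aa) points to a meiosis I error.
```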

Gel/Pedigree.

Unlike in the other content areas, the only process associated with correctness in the Gel/Pedigree model was Reason, which made a greater contribution to correct solutions here than in any other content area. In these problems, students are asked to find the most likely mode of inheritance given both a pedigree of a family and a DNA gel that shows representations of alleles for each family member. The two visuals, along with the text of the problem, give students an opportunity to provide logical explanations at many points in the problem. Students use reasoning to support intermediate claims as they think through possible solutions, and again for their final claims or for why they eliminate an option. Almost half of both correct and incorrect student answers to these questions integrated features from both the gel and the pedigree to answer the problem. Even though many correct and incorrect answers integrate, correct answers also reason. We suggest that the presence of two visual aids prompts students to integrate information from both, thus potentially increasing the likelihood of using reasoning.

Limitations

In this study, we captured the problem-solving processes of a large sample of students by asking them to write their step-by-step processes as part of an online assignment. In so doing, we may not have captured the entirety of a student’s thought process. For example, students may have felt time pressure to complete an assignment, may have experienced fatigue after answering multiple questions on the same topic, or simply may not have documented everything they were thinking. Students may also have been less likely to indicate they were engaging in drawing, as they were answering questions using an online text platform; exploring drawing in more detail in the future would require interviews or the collection of drawings as a component of the problem-solving assignment. Additionally, students may not have felt that all the steps they engaged in were worth explaining in words; this may be particularly true for metacognitive processes. Students are not likely accustomed to expressing their metacognitive processes or admitting uncertainty or confusion during assessment situations. However, even given these limitations, we have captured some of the primary components of student thinking during problem solving.

In addition, our expert–student comparison may be biased, as experts had different reasons than students for participating in the study. The experts likely did so because they wanted to be helpful and found it interesting. Students, on the other hand, had very different motivations, such as using the problems for practice in order to perform well on the next exam and/or to get extra credit. Although it is likely not possible to put experts and students in the same affective state while they are solving problems, it is worth realizing that the frequencies of processes they use could reflect their different states while answering the questions.

Finally, the questions in the assignments provided to students were similar to those seen previously during in-class work. The low prevalence of metacognitive processes in their solutions could be due to students’ perception that they have already solved similar questions. This may prevent them from articulating their plans or from checking their work. More complex, far-transfer problems would likely elicit different patterns of processes for successful problem solving.

SUGGESTIONS FOR INSTRUCTION

Calculating: In questions regarding probability, students will need to be familiar with mathematical representations and calculations. Practicing probabilistic thinking is critical.

Drawing: Capturing thought processes with a drawing can help visualize the problem space and can be used to generate supportive reasoning for one’s thinking (e.g., a drawing of the stages of meiosis). A cautionary note, however: drawing can lead to unsuccessful problem solving when used in an inappropriate context, such as drawing a Punnett square for linked genes, or drawing multiple Punnett squares when probabilities from multiple generations should instead be multiplied.

Eliminating: In questions with clear alternate final answers, eliminating answers, preferably while explaining one’s reasons, is particularly useful.

Practicing metacognition: Although there were few significant differences in metacognitive processes between correct and incorrect student answers, we still suggest that planning and checking are valuable across content areas, as demonstrated by the more frequent use of these processes by experts.

In summary, we suggest that instructors not only emphasize key pieces of challenging content for each given topic, but also consistently demonstrate possible problem-solving strategies, provide many opportunities for students to practice thinking about how to solve problems, and encourage students to explain to themselves and others why each of their steps makes sense.

ACKNOWLEDGMENTS

This work was supported by the National Science Foundation (DUE 1711348). We are grateful to Paula Lemons, Stephanie Gardner, and Laura Novick for their guidance and suggestions on this project. Special thanks also to the many students and experts who shared their thinking while solving genetics problems.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC.
  • Bassok, M., & Novick, L. R. (2012). Problem solving. In Holyoak, K. J., & Morrison, R. G. (Eds.), Oxford handbook of thinking and reasoning (pp. 413–432). New York, NY: Oxford University Press.
  • Biggs, J. B. (1987). Student approaches to learning and studying. Research monograph. Hawthorn, Australia: Australian Council for Educational Research.
  • Bloom, B. S., Krathwohl, D. R., & Masia, B. B. (1956). Taxonomy of educational objectives: The classification of educational goals. New York, NY: David McKay.
  • Gelman, A., & Hill, J. (2006). Data analysis using regression and multilevel/hierarchical models. Cambridge, England: Cambridge University Press.
  • Groll, A. (2017). glmmLasso: Variable selection for generalized linear mixed models by L1-penalized estimation. R package version 1.1.25.
  • Kindfield, A. C. H. (1993). Biology diagrams: Tools to think with. Journal of the Learning Sciences, 3(1), 1–36.
  • Lemke, J. L. (1990). Talking science: Language, learning, and values. Norwood, NJ: Ablex Publishing. Retrieved July 30, 2020, from http://eric.ed.gov/?id=ED362379
  • McDonnell, L., & Mullally, M. (2016). Teaching students how to check their work while solving problems in genetics. Journal of College Science Teaching, 46(1), 68.
  • Novick, L. R., & Bassok, M. (2005). Problem solving. In Holyoak, K. J., & Morrison, R. G. (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–349). New York, NY: Cambridge University Press.
  • Osborne, J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science, 328, 463–466.
  • Saldana, J. (2015). The coding manual for qualitative researchers. Los Angeles, CA: Sage.
  • Schen, M. (2012, March 25). Assessment of argumentation skills through individual written instruments and lab reports in introductory biology. Paper presented at the Annual Meeting of the National Association for Research in Science Teaching (Indianapolis, IN).
  • Smith, M. K., & Knight, J. K. (2012). Using the Genetics Concept Assessment to document persistent conceptual difficulties in undergraduate genetics courses. Genetics, 191, 21–32.
  • Smith, M. K., Wood, W. B., & Knight, J. K. (2008). The Genetics Concept Assessment: A new concept inventory for gauging student understanding of genetics. CBE—Life Sciences Education, 7(4), 422–430.
  • Toulmin, S. (1958). The uses of argument. Cambridge, England: Cambridge University Press.


Submitted: 21 January 2021; Revised: 16 July 2021; Accepted: 22 July 2021

© 2021 J. S. Avena et al. CBE—Life Sciences Education © 2021 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


    Quantum annealers (QA), such as D-Wave systems, become increasingly efficient and competitive at solving combinatorial optimization problems. However, solving problems that do not directly map the chip topology remains challenging for this type of quantum computer. The creation of logical qubits as sets of interconnected physical qubits overcomes limitations imposed by the sparsity of the chip ...