Math Rubrics

Rubrics are a tool for evaluating student work. There are two main types of rubrics: holistic (a rubric that provides one overall score) and analytic (a rubric that provides scores for different categories). Most of the rubrics in the SFUSD Math Core Curriculum are 4-point holistic rubrics.

A rubric is often used in conjunction with an answer key. The rubric provides a broader picture about a student’s demonstration of understanding the standards and mathematical practices, and the answer key provides specific examples of how a student might answer parts of the task.

Why do I use rubrics?

The primary purpose of a rubric is to provide specific feedback on critical elements of the task and the student work. In addition, rubrics may be used to show students the expectations before they perform a task and to give students feedback and an opportunity for revision after they perform the task. Both of these uses strongly support student learning and achievement.

When do I use rubrics?

Rubrics are traditionally used to evaluate student work after students perform a task, especially a summative task. Rubrics are included for all the Milestone Tasks, as well as some other tasks, in the SFUSD Math Core Curriculum for this purpose. Rubrics can also be used before the task to communicate performance expectations to students and after the task to communicate feedback and provide structure for revision or re-engagement.

How do I use a rubric?

One way to give feedback is to make a copy of the rubric for each student and highlight or circle the parts of the rubric that apply to that student's work. For example, you might highlight the first and third paragraphs of column 3, Meets Standards, and the second paragraph of column 2, Approaching Standards. This feedback is useful because it points students toward the next steps they could take to improve their work. If you are using a holistic rubric, you will also need to decide whether to give only whole points or to allow half points. For example, if student work shows some elements from a score of 3 but mostly elements from a score of 2, you must decide whether to give that work a score of 2.5 or a score of 2.

Using a rubric to assign grades

If you are using a rubric for an Entry, Apprentice, or Expert task, the rubric will be useful for informing your instruction and giving feedback to students, but it should not be used to assign grades. Consider whether it is fair to expect mastery of the unit's standards before you decide to grade student work. If you are using a rubric for a Milestone Task and you want to use it to assign grades, base the conversion on the score descriptors rather than converting the numbers to percentages proportionally. For example, you may want to use this guide to convert rubric scores to grades:
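The conversion guide itself is not reproduced here. As an illustration only, a descriptor-based (nonproportional) conversion might look like the sketch below; the cutoffs and letter grades are hypothetical, not the SFUSD table:

```python
# Hypothetical descriptor-based conversion of 4-point holistic rubric scores
# (half points allowed) to letter grades. The cutoffs are illustrative only.
def rubric_score_to_grade(score: float) -> str:
    """Map a rubric score on a 4-point scale to a letter grade."""
    if score >= 3.5:
        return "A"   # exceeds standards
    if score >= 3.0:
        return "B"   # meets standards
    if score >= 2.0:
        return "C"   # approaching standards
    if score >= 1.0:
        return "D"   # beginning to approach standards
    return "F"       # does not yet demonstrate understanding

print(rubric_score_to_grade(2.5))  # C
```

Note how this differs from a proportional conversion: a score of 2 out of 4 would proportionally be 50%, a failing grade, even though the descriptor says the student is approaching the standards.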

Many teachers require that students revise their work when they receive a score of 0, 1, or 2 so that they can show progress toward mastery of the standards.

This page was last updated on June 15, 2023

Berkeley Graduate Division


Examples of Rubric Creation

Creating a rubric takes time and requires thought and experimentation. Here you can see the steps used to create two kinds of rubric: one for problems in a physics exam for a small, upper-division physics course, and another for an essay assignment in a large, lower-division sociology course.

Physics Problems

In STEM disciplines (science, technology, engineering, and mathematics), assignments tend to be analytical and problem-based. Holistic rubrics can be an efficient, consistent, and fair way to grade a problem set, while an analytic rubric often gives a clearer picture of where a student should direct future learning efforts. Because holistic rubrics describe overall understanding, they can also lead to more regrade requests than an analytic rubric with explicit criteria. When starting to grade a problem, first think about the relevant conceptual ingredients in the solution. Then look at a sample of student work to get a feel for common mistakes. Decide what rubric you will use (e.g., holistic or analytic, and how many points). Apply a holistic rubric by marking comments and sorting the students' assignments into stacks (e.g., five stacks if using a five-point scale). Finally, check the stacks for consistency and record the scores. The following is a sample homework problem from a UC Berkeley Physics Department undergraduate course in mechanics.
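The stack-sorting step described above can be sketched as a simple bucketing pass; the student names and scores are invented for illustration:

```python
from collections import defaultdict

# Sort scored submissions into stacks by holistic score (five stacks for a
# five-point scale), then review each stack for consistency before recording
# the final grades.
def sort_into_stacks(scored_work: dict[str, int]) -> dict[int, list[str]]:
    stacks: dict[int, list[str]] = defaultdict(list)
    for student, score in scored_work.items():
        stacks[score].append(student)
    return dict(stacks)

stacks = sort_into_stacks({"ana": 5, "ben": 3, "chris": 3, "dee": 4})
# Reviewing stacks[3] side by side checks that all "3" papers show a
# comparable level of understanding.
print(stacks[3])  # ['ben', 'chris']
```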

Homework Problem

Learning Objective

Solve for position and speed along a projectile’s trajectory.

Desired Traits: Conceptual Elements Needed for the Solution

  • Decompose motion into vertical and horizontal axes.
  • Identify that the maximum height occurs when the vertical velocity is 0.
  • Apply kinematics equation with g as the acceleration to solve for the time and height.
  • Evaluate the numerical expression.

A note on analytic rubrics: If you decide you feel more comfortable grading with an analytic rubric, you can assign a point value to each concept. The drawback to this method is that it can sometimes unfairly penalize a student who has a good understanding of the problem but makes a lot of minor errors. Because the analytic method tends to have many more parts, the method can take quite a bit more time to apply. In the end, your analytic rubric should give results that agree with the common-sense assessment of how well the student understood the problem. This sense is well captured by the holistic method.

Holistic Rubric

A holistic rubric, closely based on a rubric by Bruce Birkett and Andrew Elby:

[a] This policy especially makes sense on exam problems, for which students are under time pressure and are more likely to make harmless algebraic mistakes. It would also be reasonable to have stricter standards for homework problems.

Analytic Rubric

The following analytic rubric takes the desired traits of the solution and assigns a point value to each component. Note that the relative point values should reflect each component's importance in the overall problem; for example, the problem-solving steps should be worth more than the final numerical value of the solution. This kind of rubric also shows clearly where a student's current understanding of the problem falls short.
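As a sketch of how such an analytic rubric might be applied, the snippet below assigns point values to the four conceptual elements listed earlier. The specific weights are hypothetical (the course rubric's actual values are not shown in the text); they follow the advice that reasoning steps should outweigh the final numerical evaluation:

```python
# Hypothetical point values for the analytic rubric of the projectile problem.
# Reasoning components are weighted more heavily than the numeric answer.
WEIGHTS = {
    "decompose_motion": 3,
    "max_height_at_zero_vertical_velocity": 3,
    "apply_kinematics_with_g": 3,
    "evaluate_numerical_expression": 1,
}

def analytic_score(credit: dict[str, float]) -> float:
    """credit maps each component to the fraction of credit earned (0.0-1.0)."""
    return sum(WEIGHTS[c] * credit.get(c, 0.0) for c in WEIGHTS)

# A student who reasons correctly but slips on the arithmetic loses only
# 1 of the 10 available points:
print(analytic_score({
    "decompose_motion": 1.0,
    "max_height_at_zero_vertical_velocity": 1.0,
    "apply_kinematics_with_g": 1.0,
    "evaluate_numerical_expression": 0.0,
}))  # 9.0
```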

Try to avoid penalizing multiple times for the same mistake by choosing your evaluation criteria to be related to distinct learning outcomes. In designing your rubric, you can decide how finely to evaluate each component. Having more possible point values on your rubric can give more detailed feedback on a student’s performance, though it typically takes more time for the grader to assess.

Of course, problems can, and often do, draw on multiple learning outcomes in tandem. When a mistake could be assigned to multiple criteria, it is advisable to check that the overall grade matches your sense of the student's mastery of the problem. Not having to decide how particular mistakes should be deducted is one advantage of the holistic rubric over the analytic one. When designing problems, it is often better to avoid several subparts that rely on prior answers, since these tend to disproportionately skew the grades of students who miss an ingredient early on. When possible, consider writing independent problems to test different learning outcomes.

Sociology Research Paper

An introductory-level, large-lecture course is a difficult setting for managing a student research assignment. With the assistance of an instructional support team that included a GSI teaching consultant and a UC Berkeley librarian,[b] sociology lecturer Mary Kelsey developed the following assignment:

This was a lengthy and complex assignment worth a substantial portion of the course grade. Since the class was very large, the instructor wanted to minimize the effort it would take her GSIs to grade the papers in a manner consistent with the assignment’s learning objectives. For these reasons Dr. Kelsey and the instructional team gave a lot of forethought to crafting a detailed grading rubric.

Desired Traits

  • Use and interpretation of data
  • Reflection on personal experiences
  • Application of course readings and materials
  • Organization, writing, and mechanics

For this assignment, the instructional team decided to grade each trait individually because there seemed to be too many independent variables to grade holistically. They could have used a five-point scale, a three-point scale, or a descriptive analytic scale. The choice depended on the complexity of the assignment and the kind of information they wanted to convey to students about their work.

Below are three of the analytic rubrics the team considered for the Argument trait, plus a holistic rubric for all the traits together. Finally, you will find the entire analytic rubric, covering all the desired traits, that was ultimately used for the assignment. Which would you choose, and why?

Five-Point Scale

Three-Point Scale

Simplified Three-Point Scale

Numbers Replaced with Descriptive Terms

For some assignments, you may choose to use a holistic rubric, or one scale for the whole assignment. This type of rubric is particularly useful when the variables you want to assess cannot be usefully separated. The team chose not to use a holistic rubric for this assignment because they wanted to grade each trait separately, but a holistic version is included here for comparison.

Final Analytic Rubric

This is the rubric the instructor finally decided to use. It rates five major traits, each on a five-point scale. This allowed for fine but clear distinctions in evaluating the students’ final papers.

[b] These materials were developed during UC Berkeley's 2005–2006 Mellon Library/Faculty Fellowship for Undergraduate Research program. Members of the instructional team who worked with Lecturer Kelsey in developing the grading rubric included Susan Haskell-Khan, a GSI Center teaching consultant and doctoral candidate in history, and Sarah McDaniel, a teaching librarian with the Doe/Moffitt Libraries.




Rubrics: Tools for Making Learning Goals and Evaluation Criteria Explicit for Both Teachers and Learners

  • Deborah Allen
  • Kimberly Tanner

*Department of Biological Sciences, University of Delaware, Newark, DE 19716; and


Department of Biology, San Francisco State University, San Francisco, CA 94132

INTRODUCTION

Introduction of new teaching strategies often expands the expectations for student learning, creating a parallel need to redefine how we collect the evidence that assures both us and our students that these expectations are in fact being met. The default assessment strategy of the typical large, introductory, college-level science course, the multiple-choice (fixed-response) exam, when used to best advantage can provide feedback about what students know and recall about key concepts. Leaving aside the difficulty inherent in designing a multiple-choice exam that captures deeper understandings of course material, its limitations become particularly notable when learning objectives include what students are able to do, as well as know, as the result of time spent in a course. If we want students to build their skill at conducting guided laboratory investigations, developing reasoned arguments, or communicating their ideas, other means of assessment, such as papers, practical exams, other demonstrations of problem solving, model building, debates, or oral presentations, to name a few, must be enlisted to serve as benchmarks of progress and/or in the assignment of grades. What happens, however, when students are novices at responding to these performance prompts in the context of science learning, and faculty are novices at communicating their expectations for a high-level performance? The more familiar terrain of the multiple-choice exam can lull both students and instructors into a false sense of security about the clarity and objectivity of the evaluation criteria (Wiggins, 1989) and make these other types of assessment strategies seem subjective and unreliable (and sometimes downright unfair) by comparison. In a worst-case scenario, the use of alternatives to the conventional exam to assess student learning can lead students to feel that there is an implicit or hidden curriculum: the private curriculum that seems to exist only in the mind's eye of a course instructor.

Use of rubrics provides one way to address these issues. Rubrics not only can be designed to formulate standards for levels of accomplishment and to guide and improve performance, but they can also make these standards clear and explicit to students. Although the use of rubrics has become common practice in the K–12 setting (Luft, 1999), the good news for instructors who find the idea attractive is that more and more examples of rubric use are appearing at the college and university level, with a variety of applications (Ebert-May, undated; Ebert-May et al., 1997; Wright and Boggs, 2002; Moni et al., 2005; Porter, 2005; Lynd-Balta, 2006).

WHAT IS A RUBRIC?

Although definitions for the word "rubric" abound, for the purposes of this feature article we use the word to denote a type of matrix that provides scaled levels of achievement or understanding for a set of criteria or dimensions of quality for a given type of performance, for example, a paper, an oral presentation, or use of teamwork skills. In this type of rubric, the scaled levels of achievement (gradations of quality) are indexed to a desired or appropriate standard (e.g., to the performance of an expert or to the highest level of accomplishment evidenced by a particular cohort of students). The possible levels of attainment for each criterion or dimension of performance are described fully enough to make them useful for judgment of, or reflection on, progress toward valued objectives (Huba and Freed, 2000).

A good way to think about what distinguishes a rubric from an explanation of an assignment is to compare it with a more common practice. When communicating to students our expectations for writing a lab report, for example, we often start with a list of the qualities of an excellent report to guide their efforts toward successful completion; we may have drawn on our knowledge of how scientists report their findings in peer-reviewed journals to develop the list. This checklist of criteria is easily turned into a scoring sheet (to return with the evaluated assignment) by the addition of checkboxes for indicating either a "yes-no" decision about whether each criterion has been met or the extent to which it has been met. Such a checklist in fact has a number of fundamental features in common with a rubric (Bresciani et al., 2004), and it is a good starting point for beginning to construct a rubric. Figure 1 gives an example of such a scoring checklist that could be used to judge a high school student poster competition.


Figure 1. An example of a scoring checklist that could be used to judge a high school student poster competition.

However, what is referred to as a "full rubric" is distinguished from the scoring checklist by its more extensive definition and description of the criteria or dimensions of quality that characterize each level of accomplishment. Table 1 provides one example of a full rubric (of the analytical type, as defined in the paragraph below) that was developed from the checklist in Figure 1. This example uses the typical grid format, in which the performance criteria or dimensions of quality are listed in the rows, and the successive cells across the three columns describe a specific level of performance for each criterion. The full rubric in Table 1, in contrast to the checklist, which only indicates whether a criterion exists (Figure 1), makes it far clearer to a student presenter what the instructor is looking for when evaluating student work.

DESIGNING A RUBRIC

A more challenging aspect of using a rubric can be finding one that provides a close enough match to a particular assignment with a specific set of content and process objectives. This is particularly true of so-called analytical rubrics. Analytical rubrics use discrete criteria to set forth more than one measure of the levels of accomplishment for a particular task, as distinguished from holistic rubrics, which provide more general, uncategorized ("lumped together") descriptions of overall dimensions of quality for different levels of mastery. Many users of analytical rubrics therefore develop their own rubric to get the best match between an assignment and its objectives for a particular course.

As an example, examine the two rubrics presented in Tables 2 and 3: Table 2 shows a holistic rubric, and Table 3 an analytical rubric. These two versions were developed to evaluate student essay responses to a particular assessment prompt, in this case a challenge in which students respond to the statement, "Plants get their food from the soil. What about this statement do you agree with? What about this statement do you disagree with? Support your position with as much detail as possible." This prompt can serve both as a preassessment, to establish what ideas students bring to the teaching unit, and as a postassessment in conjunction with the study of photosynthesis. As such, the rubric is designed to evaluate student understanding of the process of photosynthesis, the role of soil in plant growth, and the nature of food for plants. The maximum score using either rubric would be 10, with 2 points possible for each of five criteria. The holistic rubric outlines the five criteria by which student responses are evaluated, puts a 3-point scale (0, 1, or 2 points) on each criterion, and holistically describes what a 0-, 1-, or 2-point answer would contain. However, it stops short of defining in detail the specific concepts that would qualify an answer for 0, 1, or 2 points on each criterion scale. The analytical rubric shown in Table 3 does define these concepts for each criterion; it is in fact a fuller development of the holistic rubric shown in Table 2. As mentioned, developing an analytical rubric is challenging in that it pushes the instructor to define specifically the language and depth of knowledge that students need to demonstrate competency, and it is an attempt to make discrete what is fundamentally a fuzzy, continuous distribution of ways an individual could construct a response. As such, informal analysis of student responses can often play a large role in shaping and revising an analytical rubric, because student answers may hold conceptions and misconceptions that the instructor has not anticipated.

The various approaches to constructing rubrics in a sense also can be characterized as holistic or analytical. Those who offer recommendations about how to build rubrics often approach the task from the perspective of describing the essential features of rubrics (Huba and Freed, 2000; Arter and McTighe, 2001), or by outlining a discrete series of steps to follow one by one (Moskal, 2000; Mettler, 2002; Bresciani et al., 2004; MacKenzie, 2004). Regardless of the recommended approach, there is general agreement that a rubric designer must approach the task with a clear idea of the desired student learning outcomes (Luft, 1999) and, perhaps more importantly, with a clear picture of what meeting each outcome "looks like" (Luft, 1999; Bresciani et al., 2004). If this picture remains fuzzy, perhaps the outcome is not observable or measurable and thus not "rubric-worthy."

Reflection on one's particular answer to two critical questions, "What do I want students to know and be able to do?" and "How will I know when they know it and can do it well?", is not only essential to beginning construction of a rubric but also can help confirm the choice of a particular assessment task as the best way to collect evidence about how the outcomes have been met. A first step in designing a rubric, the development of a list of qualities that the learner should demonstrate proficiency in by completing an assessment task, naturally flows from this prior rumination on outcomes and on ways of collecting evidence that students have met the outcome goal. A good way to get started with compiling this list is to view existing rubrics for a similar task, even if the rubric was designed for younger or older learners or for a different subject area. For example, if one sets out to develop a rubric for a class presentation, it is helpful to review the criteria used in a rubric for oral communication in a graduate program (organization, style, use of communication aids, depth and accuracy of content, use of language, personal appearance, responsiveness to audience; Huba and Freed, 2000) to stimulate reflection on and analysis of what criteria (dimensions of quality) align with one's own desired learning outcomes. There is technically no limit to the number of criteria that can be included in a rubric, other than presumptions about the learners' ability to digest and thus make use of the information that is provided. In the example in Table 1, only three criteria were used, as judged appropriate for the desired outcomes of the high school poster competition.

After this list of criteria is honed and pruned, the dimensions of quality and proficiency need to be separately described (as in Table 1), not just listed. The extent and nature of this commentary depends on the type of rubric, analytical or holistic. Expanding the criteria in this way is inherently difficult, because it requires a thorough familiarity both with the elements comprising the highest standard of performance for the chosen task and with the range of capabilities of learners at a particular developmental level. A good way to get started is to think about how the attributes of a truly superb performance could be characterized in each of the important dimensions: the level of work students should aspire to. Common advice (Moskal, 2000) is to avoid words that connote value judgments in these commentaries, such as "creative" or "good" (as in "the use of scientific terminology is 'good'"). These terms are so general as to be valueless in guiding a learner to emulate specific standards for a task, and, although it is admittedly difficult, they need to be defined in a rubric. Again, perusal of existing examples is a good way to get started with writing the full descriptions of criteria. Fortunately, there are a number of data banks that can be searched for rubric templates of virtually all types (Chicago Public Schools, 2000; Arter and McTighe, 2001; Shrock, 2006; Advanced Learning Technologies, 2006; University of Wisconsin-Stout, 2006).

Scale 1: Exemplary, Proficient, Acceptable, Unacceptable

Scale 2: Substantially Developed, Mostly Developed, Developed, Underdeveloped

Scale 3: Distinguished, Proficient, Apprentice, Novice

Scale 4: Exemplary, Accomplished, Developing, Beginning

Huba and Freed (2000) offer the interesting recommendation that the descriptions for each level of performance provide a "real-world" connection by stating the implications for accomplishment at that level. This description of the consequences could be included in a criterion called "professionalism." For example, in a rubric for writing a lab report, at the highest level of mastery the rubric could state, "this report of your study would persuade your peers of the validity of your findings and would be publishable in a peer-reviewed journal." Acknowledging this recommendation in the construction of a rubric might help steer students toward the perception that the rubric represents the standards of a profession, and away from the perception that a rubric is just another way to give a particular teacher what he or she wants (Andrade and Du, 2005).

As a further aid for beginning instructors, a number of Web sites, both commercial and open access, have tools for online construction of rubrics from templates, for example, Rubistar (Advanced Learning Technologies, 2006) and TeAch-nology (TeAch-nology, undated). These tools allow the would-be "rubrician" to select from among the various types of rubrics, criteria, and rating scales (levels of mastery). Once these choices are made, editable descriptions fall into place in the proper cells of the rubric grid. The rubrics are stored in the site databases, but typically they can be downloaded using conventional word-processing or spreadsheet software. Further editing can produce a rubric uniquely suited to your teaching and learning goals.

ANALYZING AND REPORTING INFORMATION GATHERED FROM A RUBRIC

Whether used with students to set learning goals, as scoring devices for grading purposes, to give formative feedback to students about their progress toward important course outcomes, or for assessment of curricular and course innovations, rubrics allow for both quantitative and qualitative analysis of student performance. Qualitative analysis can yield narrative accounts of where students in general fell in the cells of the rubric, along with interpretations, conclusions, and recommendations related to student learning and development. For quantitative analysis, the various levels of mastery can be assigned different numerical scores to yield quantitative rankings, as has been done for the sample rubric in Table 1. If desired, the criteria can be given different scoring weightings (again, as in the poster-presentation rubric in Table 1) if they are not considered to have equal priority as outcomes for a particular purpose. The total scores given to each example of student work on the basis of the rubric can be converted to a grading scale, and overall performance of the class can be analyzed for each of the criterion competencies.
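A minimal sketch of this kind of quantitative analysis follows; the criteria names, weightings, and scores are hypothetical, not those of Table 1:

```python
# Hypothetical weighted analytic-rubric analysis: per-student weighted totals
# and per-criterion class averages. Weights express unequal priority among
# criteria, as described in the text.
CRITERIA_WEIGHTS = {"content": 2.0, "organization": 1.0, "visuals": 1.0}

def weighted_total(scores: dict[str, int]) -> float:
    """Weighted total score for one piece of student work."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

def class_averages(all_scores: list[dict[str, int]]) -> dict[str, float]:
    """Average score per criterion across the whole class."""
    n = len(all_scores)
    return {c: sum(s[c] for s in all_scores) / n for c in CRITERIA_WEIGHTS}

work = [{"content": 3, "organization": 2, "visuals": 3},
        {"content": 2, "organization": 3, "visuals": 1}]
print([weighted_total(s) for s in work])  # [11.0, 8.0]
print(class_averages(work))  # {'content': 2.5, 'organization': 2.5, 'visuals': 2.0}
```

The per-criterion averages are what the last sentence above refers to: they show how the class as a whole performed on each competency, independent of individual grades.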

Multiple-choice exams have the advantage that they can be computer or machine scored, allowing for analysis and storage of more specific information about different content understandings (particularly misconceptions) for each item, even for large numbers of students. The standard rubric-referenced assessment is not designed to easily provide this type of analysis about specific details of content understanding; for the types of tasks for which rubrics are designed, content understanding is typically displayed by some form of narrative, free-choice expression. To capture both the benefits of the free-choice narrative and an in-depth analysis of students' content understanding, particularly for large numbers of students, a special type of rubric, called the double-digit rubric, can be used. A large-scale example of the use of this type of scoring rubric is given by the Trends in International Mathematics and Science Study (1999), in which double-digit rubrics were used to code and analyze student responses to short essay prompts.

To better understand how and why these rubrics are constructed and used, refer to the example provided in Figure 2. This double-digit rubric was used to score and analyze student responses to an essay prompt about ecosystems that was accompanied by the standard "sun-tree-bird" diagram (a drawing of the sun, a tree, and other plants; various primary and secondary consumers; and some not-well-identifiable decomposers, with interconnecting arrows that could be interpreted as energy flow or cycling of matter). A brief narrative summarizing the "big ideas" that could be included in a complete response, along with a sample response that captures many of them, accompanies the actual rubric. The rubric itself specifies major categories of student responses, from complete to various levels of incompleteness. Each level is assigned one of the first digits of the scoring code, which could correspond to a conventional point total awarded for a particular response. In the example in Figure 2, a complete response is awarded a maximum of 4 points, and the levels of partially complete answers receive successively lower points. The "incomplete" and "no response" categories are assigned first digits of 7 and 9, respectively, rather than 0, for clarity in coding; they can be converted to zeroes for averaging and reporting of scores.


Figure 2. A double-digit rubric used to score and analyze student responses to an essay prompt about ecosystems.

The second digit is assigned to types of student responses in each category, including common approaches and misconceptions. For example, code 31 under the first partial-response category denotes a student response that "talks about energy flow and matter cycling, but does not mention loss of energy from the system in the form of heat." The sample double-digit rubric in Figure 2 shows the code numbers that were assigned after a "first pass" through a relatively small number of sample responses. Additional codes were assigned later, as more responses were reviewed and the full variety of student responses was revealed. In both cases, the second digit of 9 was reserved for a general description that could be assigned to a response that might be unique to one or only a few students but nevertheless belonged in a particular category. When refined through several assessments of student work by a number of reviewers, this type of rubric can provide very specific quantitative and qualitative understanding, analysis, and reporting of trends in student understanding of important concepts. A high number of 31 scores, for example, could provide a major clue about deficiencies in past instruction and thus goals for future efforts. However, this type of analysis remains expensive, in that scores must be assigned and entered into a database, rather than simply collected as with a multiple-choice test.
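The scoring and tallying logic of a double-digit rubric can be sketched as follows. The digit conventions mirror those described above (first digit as point level, with 7 and 9 converting to zero points; second digit as response type), but the class data are invented:

```python
from collections import Counter

# Double-digit rubric analysis sketch. First digit = response category
# (here 4 = complete, lower = partially complete; 7 = incomplete and
# 9 = no response, both worth 0 points). Second digit = response type
# within the category, with 9 reserved for "other" responses.
def points(code: int) -> int:
    """Convert a double-digit code to conventional points."""
    first = code // 10
    return 0 if first in (7, 9) else first

codes = [40, 31, 31, 22, 79, 31, 90]  # hypothetical class results
mean_points = sum(points(c) for c in codes) / len(codes)
print(round(mean_points, 2))

# Tallying full codes surfaces common misconceptions: many 31s here would
# suggest students omit the loss of energy from the system as heat.
print(Counter(codes).most_common(1))  # [(31, 3)]
```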

WHY USE RUBRICS?

When used as teaching tools, rubrics not only make the instructor's standards and resulting grading explicit, but they can also give students a clear sense of what the expectations are for a high level of performance on a given assignment, and how they can be met. This use of rubrics can be most important when students are novices with respect to a particular task or type of expression (Bresciani et al., 2004).

From the instructor's perspective, although the time expended in developing a rubric can be considerable, once rubrics are in place they can streamline the grading process. The more specific the rubric, the less the requirement for spontaneous written feedback for each piece of student work—the type that is usually used to explain and justify the grade. Although provided with fewer written comments that are individualized for their work, students nevertheless receive informative feedback. When information from rubrics is analyzed, a detailed record of students' progress toward meeting desired outcomes can be monitored and then provided to students so that they may also chart their own progress and improvement. With team-taught courses or multiple sections of the same course, rubrics can be used to make faculty standards explicit to one another, and to calibrate subsequent expectations. Good rubrics can be critically important when student work in a large class is being graded by teaching assistants.

Finally, by their very nature, rubrics encourage reflective practice on the part of both students and teachers. In particular, the act of developing a rubric, whether or not it is subsequently used, instigates a powerful consideration of one's values and expectations for student learning, and the extent to which these expectations are reflected in actual classroom practices. If rubrics are used in the context of students' review of their own work or peer review of others' work, or if students are involved in the process of developing the rubric, these processes can spur the development of students' capacity for self-directed learning and give them insight into how they and others learn (Luft, 1999).

ACKNOWLEDGMENTS

We gratefully acknowledge the contribution of Richard Donham (Mathematics and Science Education Resource Center, University of Delaware) for development of the double-digit rubric in Figure 2 .

  • Advanced Learning Technologies, University of Kansas (2006). Rubistar. http://rubistar.4teachers.org/index.php (accessed 28 May 2006).
  • Andrade H., Du Y. (2005). Student perspectives on rubric-referenced assessment. Pract. Assess. Res. Eval. 10. http://pareonline.net/pdf/v10n3.pdf (accessed 18 May 2006).
  • Arter J. A., McTighe J. (2001). Scoring Rubrics in the Classroom: Using Performance Criteria for Assessing and Improving Student Performance. Thousand Oaks, CA: Corwin Press.
  • Bresciani M. J., Zelna C. L., Anderson J. A. (2004). Criteria and rubrics. In: Assessing Student Learning and Development: A Handbook for Practitioners. Washington, DC: National Association of Student Personnel Administrators, 29-37.
  • Chicago Public Schools (2000). The Rubric Bank. http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/Rubric_Bank/rubric_bank.html (accessed 18 May 2006).
  • Ebert-May D., Brewer C., Allred S. (1997). Innovation in large lectures—teaching for active learning. Bioscience 47, 601-607.
  • Ebert-May D. (undated). Scoring Rubrics. Field-tested Learning Assessment Guide. http://www.wcer.wisc.edu/archive/cl1/flag/cat/catframe.htm (accessed 18 May 2006).
  • Huba M. E., Freed J. E. (2000). Using rubrics to provide feedback to students. In: Learner-Centered Assessment on College Campuses. Boston: Allyn and Bacon, 151-200.
  • Luft J. A. (1999). Rubrics: design and use in science teacher education. J. Sci. Teach. Educ. 10, 107-121.
  • Lynd-Balta E. (2006). Using literature and innovative assessments to ignite interest and cultivate critical thinking skills in an undergraduate neuroscience course. CBE Life Sci. Educ. 5, 167-174.
  • MacKenzie W. (2004). Constructing a rubric. In: NETS●S Curriculum Series: Social Studies Units for Grades 9–12. Washington, DC: International Society for Technology in Education, 24-30.
  • Mettler C. A. (2002). Designing scoring rubrics for your classroom. In: Understanding Scoring Rubrics: A Guide for Teachers, ed. C. Boston. College Park, MD: ERIC Clearinghouse on Assessment and Evaluation, University of Maryland, 72-81.
  • Moni R., Beswick W., Moni K. B. (2005). Using student feedback to construct an assessment rubric for a concept map in physiology. Adv. Physiol. Educ. 29, 197-203.
  • Moskal B. M. (2000). Scoring Rubrics Part II: How? ERIC/AE Digest, ERIC Clearinghouse on Assessment and Evaluation, ERIC Identifier #ED446111. http://www.eric.ed.gov (accessed 21 April 2006).
  • Porter J. R. (2005). Information literacy in biology education: an example from an advanced cell biology course. Cell Biol. Educ. 4, 335-343.
  • Schrock K. (2006). Kathy Schrock's Guide for Educators. http://school.discovery.com/schrockguide/assess.html#rubrics (accessed 5 June 2006).
  • TeAch-nology, Inc. (undated). TeAch-nology. http://teach-nology.com/web_tools/rubrics (accessed 7 June 2006).
  • Trends in International Mathematics and Science Study (1999). Science Benchmarking Report, 8th Grade, Appendix A: TIMSS Design and Procedures. http://timss.bc.edu/timss1999b/sciencebench_report/t99bscience_A.html (accessed 9 June 2006).
  • University of Wisconsin–Stout (2006). Teacher Created Rubrics for Assessment. http://www.uwstout.edu/soe/profdev/rubrics.shtml (accessed 7 June 2006).
  • Wiggins G. (1989). A true test: toward more authentic and equitable assessment. Phi Delta Kappan 49, 703-713.
  • Wright R., Boggs J. (2002). Learning cell biology as a team: a project-based approach to upper-division cell biology. Cell Biol. Educ. 1, 145-153.

© 2006 by The American Society for Cell Biology


TeachOnline@UW: Rubrics – Advantages and Best Practices

Types of Rubrics

Analytic rubrics.

Analytic Rubrics feature a grid of “criteria” (columns) and “levels” of achievement (rows). The instructor assigns points or weights to particular criteria and then evaluates student performance in each area. This is useful for providing feedback on areas of strength and weakness. Because of this detail, analytic rubrics take more time to develop than holistic rubrics. See example of an analytic rubric.

Analytic rubrics are particularly useful for problem-solving or application assessments because a rubric can list a different category for each component of the assessment that needs to be included, thereby accounting for the complexity of the task. For example, a rubric for a research paper could include categories for organization, writing, argument, sources cited, depth of content knowledge, and more. A rubric for a presentation could include categories related to style, organization, language, content, etc. Students benefit from receiving rubrics because they learn about their relative strengths and weaknesses.
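
As a rough sketch of how per-criterion levels might be combined into one grade, assuming invented criteria, weights, and levels (the criterion names echo the research-paper example above; the weights are illustrative, not from any published rubric):

```python
# Hypothetical weights: more important criteria count more heavily.
RUBRIC_WEIGHTS = {
    "organization": 2,
    "writing": 2,
    "argument": 3,
    "sources cited": 3,
}

def weighted_score(levels: dict[str, int]) -> int:
    """Sum each criterion's level (1-4) times its weight."""
    return sum(RUBRIC_WEIGHTS[c] * level for c, level in levels.items())

# Best possible total: level 4 on every criterion.
MAX_SCORE = 4 * sum(RUBRIC_WEIGHTS.values())

# One student's per-criterion levels (invented).
student = {"organization": 4, "writing": 3, "argument": 3, "sources cited": 2}
print(f"{weighted_score(student)} / {MAX_SCORE}")  # prints 29 / 40
```

The per-criterion levels, not just the total, are what give students the feedback on relative strengths and weaknesses described above.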


Holistic rubrics.

Holistic Rubrics describe characteristics of each level of performance for an assignment or activity overall (e.g., characteristics of an excellent research paper). See an example of a holistic rubric.

Holistic rubrics are best used when there is no single correct answer or response and the focus is on overall quality, proficiency, or understanding of specific content or skills.


You want to assign a score based on an overall judgement of your students’ work. Would you choose an analytic rubric or a holistic rubric?

You want to assign points based on achievement level for several criteria. Would you choose an analytic or a holistic rubric?

Rubrics: Advantages and Best Practices Copyright © by Karen Skibba. All Rights Reserved.


Analytic vs. Holistic Rubrics: Which Type of Rubric Should You Use?

Damon Torgerson : Aug 7, 2022 4:00:00 PM


“One test of the correctness of educational procedure is the happiness of the child.” ~Maria Montessori

Good teachers use an array of tools to assess student learning, provide feedback, and improve their teaching skills. Rubrics have become increasingly popular, but the question is: what type of rubric should you use, and when?

The Alludo PD content catalog includes a variety of microlearning activities related to the creation and use of rubrics. One of the most important questions to ask before designing a rubric is whether to use a holistic rubric or an analytic rubric. In this post, we’ll explore the components of analytic vs. holistic rubrics, including which type is easiest to create, when to use them, and the pros and cons of each type.

Table of Contents

  • Advantages of Holistic Rubrics
  • Disadvantages of Holistic Rubrics
  • Advantages of Analytic Rubrics
  • Best Time to Use a Holistic Rubric
  • Best Time to Use an Analytic Rubric

  • Alludo's Take
  • Equip Teachers in Your District to Develop Rubrics

What is a Holistic Rubric?

A holistic rubric is a rubric where all elements of a student’s work are evaluated together using a single scale. Students are assigned a point score based on an overall judgment of the work presented. There may be many elements that affect the score, but these things are not scored individually in a holistic rubric.

In general, holistic rubrics are useful when a teacher wants to grade a student’s general progress and performance and doesn’t need the specificity that would come with a different type of rubric. Holistic rubrics are typically scored on either a scale of 1-4 or a scale of 1-6. Holistic rubrics have their advantages but they’re not ideal for every assignment or situation.
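
A holistic rubric can be represented as nothing more than an ordered set of level descriptors, one per score. The sketch below uses a 4-point scale with invented level descriptions (not taken from any published rubric):

```python
# Illustrative 4-point holistic rubric: one descriptor per overall score.
HOLISTIC_LEVELS = {
    4: "Exceeds standards: complete, correct, clearly explained",
    3: "Meets standards: mostly complete with minor errors",
    2: "Approaching standards: partial understanding, notable gaps",
    1: "Below standards: little evidence of understanding",
}

def describe(score: int) -> str:
    """Return the single overall descriptor that a score maps to."""
    if score not in HOLISTIC_LEVELS:
        raise ValueError(f"score must be one of {sorted(HOLISTIC_LEVELS)}")
    return HOLISTIC_LEVELS[score]

print(describe(3))
```

Note there is deliberately no per-criterion breakdown here: the single scale is what makes the rubric holistic.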


Advantages of Holistic Rubrics

Holistic rubrics can be useful for evaluating a student’s general performance and grasp of the subject matter being taught.

Emphasize What Learners Can Do

Holistic rubrics put the emphasis on what learners can do as opposed to what they cannot. Marking a rubric by focusing on student achievement can boost learners’ confidence and help them feel good about their work while still offering room for improvement.

Easy to Create and Use

Compared to some other types of rubrics, holistic rubrics are less time-consuming to create and use because there is a single scale. Students understand that they will be given one overall point score and raters can assess the results quickly.

Consistent and Reliable

Holistic rubrics tend to have more consistent ratings than other rubrics because scores can be applied consistently by trained raters. Consistent scoring of rubrics makes scores more reliable and that’s useful to both teachers and students.
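
One simple way to quantify the rater consistency described above is the exact-agreement rate between two trained raters scoring the same set of papers. The scores below are made-up illustrative data:

```python
# Holistic scores (1-4 scale) assigned to the same eight papers by two
# raters; the values are invented for illustration.
rater_a = [3, 4, 2, 3, 1, 4, 3, 2]
rater_b = [3, 4, 3, 3, 1, 4, 2, 2]

# Exact agreement: both raters gave the identical score.
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
agreement_rate = agreements / len(rater_a)
print(f"{agreement_rate:.0%}")  # prints 75%
```

In practice, raters often also track "adjacent agreement" (scores within one point) and use discrepant papers to recalibrate before scoring continues.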

Disadvantages of Holistic Rubrics

Here are some of the disadvantages of holistic rubrics.

No Place for Specific Feedback

The primary disadvantage of holistic rubrics from the perspective of learners is that there is no room for them to receive specific feedback about elements that might be scored separately with a different type of rubric. Targeted feedback helps students improve and holistic rubrics don’t provide it.

Scoring Can Be Challenging

Because scorers must assign a single score for the entire assignment, it can be difficult to nail down a single score for work that spans multiple criteria. Scorers must find a way to evaluate the work even when some areas of the student’s work are excellent and others need improvement. 

Criteria Are Not Weighted

When scoring a holistic rubric, the criteria are not weighted. The assignment must be evaluated as a whole, which can be disadvantageous because students may not understand what they have done well and what needs improvement. It can also make the scoring process more difficult for teachers, because there is no one-size-fits-all way to calculate an appropriate score that takes all elements of the work into account.


What is an Analytic Rubric?

In contrast to a holistic rubric, an analytic rubric is scored using a grid that outlines the criteria for a student assignment. Each criterion should be in a separate row and each potential score in a separate column.

The levels of student performance are typically assigned a number and may also have descriptive tags such as Above Average, Sufficient, Developing, or Needs Improvement. The cells in the center of the grid may be used to describe the details of what the criteria would look like for each potential score. 

Unlike what happens with a holistic rubric, analytic rubrics allow for the separate scoring of assessment criteria. Common criteria include the following:

  • Clarity: Is the thesis supported by relevant information and ideas?
  • Organization: Is the information presented in a logical order that helps the presentation flow?
  • Mechanics: Are the grammar and spelling correct, or do errors distract from the presentation?

Teachers may want to use the cells in the middle of the grid to spell out details that support the score. For example, a high score in mechanics would be supported if the student made few or no spelling and grammatical errors.
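
The grid just described maps naturally onto a nested structure: criterion, then level, then the cell descriptor a scorer would circle. The criteria and wording below are invented for this sketch:

```python
# Sketch of an analytic-rubric grid: criterion -> level -> cell text.
GRID = {
    "mechanics": {
        4: "No spelling or grammatical errors",
        3: "A few minor errors that do not distract",
        2: "Frequent errors that sometimes distract",
        1: "Errors seriously interfere with the presentation",
    },
    "organization": {
        4: "Logical order throughout; presentation flows",
        3: "Mostly logical order with small lapses",
        2: "Order unclear in places",
        1: "No discernible order",
    },
}

def cell(criterion: str, level: int) -> str:
    """Look up the descriptor for one criterion at one level."""
    return GRID[criterion][level]

print(cell("mechanics", 4))
```

Filling in every cell before grading begins is what makes the later per-criterion scores defensible and consistent across papers.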


Advantages of Analytic Rubrics

Using analytic rubrics has some advantages that accrue to both teachers and students. Here are the most important advantages to consider.

Feedback on Strengths and Weaknesses

In an analytic rubric, students receive scores that reflect their individual strengths and weaknesses. Specific feedback allows students to identify areas where they need improvement and focus on them for future assignments. This feature is also useful for teachers who are able to pinpoint areas where students may need help to meet expectations.

Criteria Are Weighted

Each criterion that makes up the rubric is considered in the overall grade. A student who struggles with spelling may make up for a low score in mechanics if they do an excellent job of presenting the information in an organized manner with facts that support their thesis. Weighting criteria also makes scoring easy for teachers, particularly if the middle of the rubric grid is used to spell out what elements must be present (or missing) to earn each score.
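The arithmetic behind weighted scoring can be shown in a short sketch. The criteria, weights, and scores below are hypothetical, chosen only to illustrate how a low score in one area can be offset by strong scores elsewhere:

```python
def weighted_score(scores: dict, weights: dict) -> float:
    """Combine per-criterion scores using fractional weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[c] * weights[c] for c in scores)

# Hypothetical example: a student weak in mechanics still earns a solid
# overall score because the other criteria carry more weight.
scores  = {"Clarity": 4, "Organization": 4, "Mechanics": 2}
weights = {"Clarity": 0.4, "Organization": 0.4, "Mechanics": 0.2}
overall = weighted_score(scores, weights)  # 4*0.4 + 4*0.4 + 2*0.2 = 3.6
```

Equal weights are the simplest choice, but shifting weight toward the criteria that matter most for a given assignment lets the final score reflect the teacher's priorities.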

Disadvantages of Analytic Rubrics

There are two main disadvantages to consider when designing an analytic rubric, and both are important.

Time-Consuming to Create

Analytic rubrics take more time to create than holistic rubrics because they have more parts: each element of the student's work must be defined and scored individually. Spelling out what each score means for each criterion adds to the workload as well.

Consistency Can Be an Issue

Consistency can be an issue with analytic rubrics unless the person creating them takes the time to define each element and the criteria to be used to judge it. Inconsistent scoring can undermine trust in the scorers.


When Should You Use a Holistic vs. Analytic Rubric?

The choice of whether to use a holistic or an analytic rubric is important because each has its uses and advantages. In most cases, the assignment itself will dictate which rubric will be most helpful in scoring students’ work.

The best time to use a holistic rubric is when creating an assignment where there is no single, correct answer or response. Your goal is to get students to think through a problem and present what they have learned and considered in a compelling way.

Said another way, holistic rubrics are most useful if you want to grade students’ work based on its overall quality or their overall understanding of concepts and information. Providing feedback based on individual criteria is less important than gauging a student’s progress and general performance.

The best time to use an analytic rubric is when you need to evaluate students’ work in multiple areas and want to assess their proficiency or progress in each.

Analytic rubrics are useful for problem-solving assignments or projects with multiple components. An example would be a student presentation that would be graded on content, language, organization, style, and other elements.

You should use an analytic rubric if you want to show students their relative strengths and weaknesses with an eye toward helping them improve where improvement is needed.


Alludo’s Take

Alludo partners with school districts around the country to help them provide dynamic and engaging professional learning for teachers, staff, and administrators. We understand that teachers who are familiar with and flexible about the type of rubric they use can work most efficiently. That's why we have included missions and micro-learning activities about rubrics in our content catalog.

In addition to including information about various types of rubrics and how to create them, we have also included activities to help teachers learn about rubric assessments and how to apply scores consistently.


In districts using the Alludo platform, teachers are engaged because we give them a voice and a choice in what they learn. By incorporating gamification and a system of rewards, we make professional development both accessible and fun.

Equip Teachers in Your District to Develop Effective Rubrics

Developing rubrics requires careful planning and an understanding of analytic vs. holistic rubrics, including the advantages and disadvantages of each. The information we’ve included here can help you determine which type of rubric is best for an assignment.

Experience personalized learning for all levels of educators with a free trial of Alludo’s professional development platform. You’ll enjoy:

  • Hundreds of core topics
  • Asynchronous microlearning activities
  • Timely and specific feedback
  • Analytics that show learning impact
  • Access anytime, anywhere

Play Alludo For Free Today

