
Physics Education Research and Development Group

Open-ended problem analysis (Coding Rubric)

This description of the problem-solving coding rubric is taken from Tom Foster's PhD thesis.

For Jennifer Docktor's problem-solving rubric and coding, go here.

Measuring students' problem-solving ability is a non-trivial task. Many researchers of classroom physics problem solving use student grades as such a measure, while others use controlled grading, in which several expert physicists grade identical solutions, compare their respective scores, and then reach consensus on the differences. Both methods fall short of measuring problem-solving ability; what they measure instead is the correctness of a solution. A student who chose to apply the wrong physics concepts, yet applied them correctly, typically receives a low grade. Conversely, a student who managed to reach a correct solution by manipulating all of the given information in a haphazard manner would probably receive a high grade, yet fails to display desirable problem-solving ability. This is not meant to imply that grades are without merit, only that grades measure something different from problem-solving ability. For this study, a more sophisticated technique was needed.

To develop a technique to measure problem-solving ability, the first step is to decide what is meant by "problem-solving ability." For physicists, the only meaningful definition of problem-solving ability is that their students solve problems like physicists. Fortunately, the expert-novice problem-solving literature has already illustrated how expert physicists solve problems. From this literature, four relevant behaviors are evident in expert solutions: (1) experts perform an initial qualitative analysis of a problem (Larkin, 1979); (2) experts use their initial analysis to create a domain-specific representation (Larkin & Reif, 1979); (3) experts work from general principles to the desired goal (Larkin, 1980); and (4) good problem solvers plan their solution before starting it (Finegold & Mass, 1985). From these four behaviors, a problem-solving ability coding rubric was developed.

The problem-solving ability coding rubric used in this study has four dimensions; these dimensions and their sub-codes are given in Tables 1 through 4. The first dimension is General Approach (GA), which assesses the student's initial qualitative approach. It is in this dimension that any conceptual errors the student made are reported. The second dimension is Specific Application of Physics (SAP), which assesses the student's domain-specific knowledge. A student's SAP is dependent upon their GA, so even if the concepts applied are not wholly appropriate for a successful solution, the application of those concepts is still judged. The third dimension is Logical Progression (LP), which codes the cohesiveness of a student's solution. It also measures whether a student works forward or backward (Larkin, 1980). The final dimension is Appropriate Mathematics (AM), which accounts for a student's level of mathematical skill as applied to the specific problem. It is essentially a judgment of a student's ability to transfer the mathematics learned in the context of a math class to the new context of physics. Each dimension is coded and reported separately.

Table 1: General Approach

0. Nothing written.
1. Physics approach is inappropriate; a successful solution is not possible.
2. Physics approach is appropriate, but the manner of its application indicates a fundamental misunderstanding.
3. Physics approach is appropriate, but a wrong assertion is made as a serious misinterpretation of given information.
4. Physics approach is appropriate, but neglects one or more other principles necessary for the solution.
5. Physics approach is appropriate and all necessary principles included, but errors are evident.
6. Physics approach is appropriate and all necessary principles included without any conceptual errors.

Table 2: Specific Application of Physics

0. Nothing written.
1. Difficult to assess (GA#2).
2. Solution does not proceed past basic statement of concepts.
3. Vector/scalar confusion.
4. Specific equations are incomplete.
5. Confusion resolving vectors into components.
6a. Wrong variable substitution: Poor variable definition.
6b. Wrong variable substitution: Difficulty in translating to a mathematical representation.
7a. Careless use of coordinate system without a coordinate system defined.
7b. Careless use of coordinate system with a coordinate system defined.
8. Careless substitution of given information.
9a. Specific equations do not exhibit clear inconsistencies with the general approach, but hard to tell due to poor communication.
9b. Specific equations do not exhibit clear inconsistencies with the general approach and the solution is clear.

Table 3: Logical Progression

0. Nothing written.
1. Not applicable - one step problem.
2. The use of equations appears haphazard and the solution unsuccessful. Student may not know how to combine equations.
3a. Solution is logical to a point, then an illogical jump is made. Student may abandon earlier physics claims to reach answer.
3b. Solution is somewhat logical, but frequent unnecessary steps are made. Student may abandon earlier physics claims to reach answer.
4. Solution is logical, but unfinished. Student may stop to avoid abandoning earlier physics claims.
5. Solution meanders successfully toward answer.
6. Solution progresses from goal to answer.
7. Solution progresses from general principles to answer.

Table 4: Appropriate Mathematics

0. Nothing written.
1. Solution terminates for no apparent reason.
2a. When an obstacle occurs, "math magic" or other unjustified relationships appear.
2b. When an obstacle occurs, the solution stops.
3. Solution violates rules of algebra, arithmetic, or calculus
4. Serious math errors
5a. Mathematics is correct, but numbers substituted at each step.
5b. Mathematics is correct, but numbers substituted at last step.
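
Because each dimension is coded and reported separately, a coded solution is naturally represented as a four-field record. Below is a minimal sketch in Python of one way to store and validate such codes; the names RubricScore, GA_CODES, and so on are illustrative assumptions for this sketch, not part of the original coding scheme.

    from dataclasses import dataclass

    # Valid codes per dimension, transcribed from Tables 1 through 4.
    GA_CODES = {"0", "1", "2", "3", "4", "5", "6"}                      # Table 1
    SAP_CODES = {"0", "1", "2", "3", "4", "5",
                 "6a", "6b", "7a", "7b", "8", "9a", "9b"}               # Table 2
    LP_CODES = {"0", "1", "2", "3a", "3b", "4", "5", "6", "7"}          # Table 3
    AM_CODES = {"0", "1", "2a", "2b", "3", "4", "5a", "5b"}             # Table 4

    @dataclass
    class RubricScore:
        """One coded student solution; the four dimensions stay separate."""
        general_approach: str          # GA, Table 1
        specific_application: str      # SAP, Table 2
        logical_progression: str      # LP, Table 3
        appropriate_mathematics: str   # AM, Table 4

        def __post_init__(self):
            # Reject any code that does not appear in the rubric tables.
            for code, valid, name in [
                (self.general_approach, GA_CODES, "GA"),
                (self.specific_application, SAP_CODES, "SAP"),
                (self.logical_progression, LP_CODES, "LP"),
                (self.appropriate_mathematics, AM_CODES, "AM"),
            ]:
                if code not in valid:
                    raise ValueError(f"{name} code {code!r} is not in the rubric")

    # Example: appropriate approach that neglects a principle (GA 4),
    # incomplete specific equations (SAP 4), logical but unfinished (LP 4),
    # correct mathematics with numbers substituted at each step (AM 5a).
    score = RubricScore("4", "4", "4", "5a")

Keeping the codes as strings preserves sub-codes such as 6a and 6b, and validating at construction time catches transcription errors before any analysis; the dimensions are deliberately never summed into a single score.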