
SHUHUA AN and ZHONGHE WU

ENHANCING MATHEMATICS TEACHERS' KNOWLEDGE OF STUDENTS' THINKING FROM ASSESSING AND ANALYZING MISCONCEPTIONS IN HOMEWORK
Received: 26 November 2007; Accepted: 20 October 2011

ABSTRACT. This study focuses on teacher learning of student thinking through grading
homework, assessing and analyzing misconceptions. The data were collected from 10
teachers at fifth–eighth grade levels in the USA. The results show that assessing and
analyzing misconceptions from grading homework is an important approach to acquiring
knowledge of students’ thinking. By engaging in the inquiry process of the 4 steps of
identifying errors, analyzing reasons for the errors, designing approaches for correction,
and taking action for correction, the teachers made obvious progress in their knowledge of
students’ thinking, understood the difficulties and challenges their students had in learning
mathematics, and enhanced their pedagogical content knowledge.

KEY WORDS: analysis of misconceptions, assessment, grading homework, knowledge of student thinking, pedagogical content knowledge, teacher learning

INTRODUCTION

According to the National Research Council (NRC) (2001), classroom teachers should engage in "inquiry to deepen their understanding of
students’ thinking” (p. 389). Knowledge of students’ thinking is a major
component of pedagogical content knowledge of mathematics teaching
(Shulman, 1987; An, Kulm & Wu, 2004). It furnishes specific insight that
allows teachers to gauge how well students understand mathematical
concepts, understand possible misconceptions and their patterns, and
develop comprehensible strategies to correct these misconceptions. It will
be beneficial to classroom teachers if they can assess their students’
thinking on a daily basis by analyzing their work, thus enhancing their
pedagogical content knowledge of mathematics teaching (An & Wu,
2008).
How to assess students’ thinking in mathematics by classroom teachers
is a main concern in current research. Little up-to-date information is
available on how US teachers conduct internal assessment in mathematics
(NRC, 2001). An & Wu (2008)’s study shows that the internal assessment
can be achieved from assessing students’ thinking by analyzing student
homework because homework often involves knowledge that is “in

International Journal of Science and Mathematics Education (2012) 10: 717Y753


# National Science Council, Taiwan 2011
718 SHUHUA AN AND ZHONGHE WU

process” of being internalized. Such an approach can provide a lens for


the teacher in diagnosing learner problems, through which the teacher can
view students’ thinking at a deep internal level and provide timely
feedback to clarify misconceptions (An & Wu, 2008).
A comparative study on mathematics teachers’ pedagogical content
knowledge conducted by An (2000) reveals that some US teachers do not
really examine homework at all in the same way as Chinese teachers do.
Because the time US teachers have available to review homework is
limited (An & Wu, 2008), many of them grade their students’ homework
by “complete or incomplete,” or provide answers to students at the
beginning of class for their self-correction or group correction. In
contrast, Chinese teachers not only evaluate each problem but also
correct mistakes “face to face” and analyze patterns of errors in
homework. An’s study suggests that more attention could be paid to
assessing and understanding students’ thinking from examining
students’ homework, thereby improving the effectiveness of their
instruction (An, 2004).
Following the suggestions of the research study, to help mathematics
teachers acquire deep knowledge of their students’ thinking through a
specific strategy, this study examined whether engaging in an assessment
process of grading student homework builds such knowledge and
investigated effective approaches of analyzing student work, given the
limited time in daily teaching in the USA. The following research
questions were posed in this study: How can mathematics teachers
acquire knowledge of students’ thinking by carefully grading students’
homework and analyzing errors? What are the effects of grading
homework and analyzing errors on teachers’ learning and understanding
of students’ thinking?

CONCEPTUAL FRAMEWORK

Teachers’ Pedagogical Content Knowledge of Student Thinking


Teaching mathematics well is a complex endeavor, and “effective
teaching requires knowing and understanding mathematics, students as
learners, and pedagogical strategies” (NCTM, 2000, p. 17). For many
years, research has paid attention to teachers’ knowledge (Fennema &
Franke, 1992; Cooney & Shealy, 1997; Ma, 1999; An, 2000, 2004; Hill,
Rowan & Ball, 2005). Some studies have posited the importance of
profound mathematical content knowledge (Ma, 1999; Spitzer, Strage & Bergthold, 2007). Others have called this content knowledge specialized mathematics knowledge (Hill et al., 2005); later, it was referred to as
“mathematical knowledge for teaching,” meaning the mathematical
knowledge needed to carry out the work of teaching mathematics (Ball,
Thames & Phelps, 2008). However, no matter what names knowledge of
teaching has been given, the successful teachers should acquire essential
knowledge in teaching—pedagogical content knowledge (Shulman, 1987;
Fennema & Franke, 1992; Wilson, Floden & Ferrini-Mundy, 2002) because
mathematics knowledge is not the only issue in teaching and learning and
pedagogical content knowledge is also critical (Walshaw, 2002).
A study by An et al. (2004) echoes the need for pedagogical content
knowledge and agrees that the primary knowledge for mathematics
teachers is the pedagogical content knowledge that connects content and
pedagogy in four dimensions: building on students’ ideas in mathematics,
addressing students’ misconceptions, engaging students in mathematics
learning, and developing student thinking about mathematics.
Various studies suggest that teachers with a better understanding of
children’s thinking develop profound mathematical understanding (Sowder,
Philipp, Armstrong & Schappelle, 1998). The importance of knowledge of
student thinking is also seen in Shulman's (1986) and Fennema & Franke's (1992) models of pedagogical content knowledge. Shulman (1986)
included knowledge of learners and their characteristics as a part of
pedagogical content knowledge and suggested that teachers should
understand both the level of difficulty of a particular concept and the prior
knowledge students usually bring to these topics and also have a grasp of
alternative representations for particular content. Further, Fennema & Franke
(1992) included students’ thinking in their pedagogical content knowledge
model that has five parts: (1) content, (2) pedagogy, (3) learners’ cognitions,
(4) content-specific knowledge, and (5) teachers' beliefs. An et al.'s (2004) study also asserts that knowledge of student thinking is central to pedagogical content knowledge.
Knowledge of students’ thinking requires specific knowledge that
allows teachers to know how well students understand mathematics
concepts, understand possible misconceptions and their patterns, and
develop comprehensible strategies to correct misconceptions. Teachers
should “engage not only in inquiry about how to apply knowledge about
students’ thinking in planning and implementing instruction but also in
inquiry to deepen their understanding of students’ thinking” (NCTM,
2000, p. 389).
Although there is growing evidence that knowledge of students’
thinking is grounded in practice (Falkner, Levi & Carpenter, 1999), too
little of the extant research probes the important and specific strategy of
grading homework to acquire knowledge of students’ mathematical
thinking (Cooper, 1989b). It is necessary to investigate effective approaches to gaining knowledge of students' mathematical thinking from their work.

Approaches to Assessing Students’ Thinking from Grading Homework


Coulter (1979) suggests paying attention to actual teacher behavior in the
three phases of the homework process model: introduction, structuring,
and follow-up. In the follow-up phase, teachers grade homework, correct
it, provide feedback, and test whether the results relate to other class
work. Based on Coulter's model, grading homework in this study refers
to teacher behavior in the follow-up homework phase involving grading,
correcting, providing feedback to homework, and analyzing error patterns
(An, 2004).
Cooper (1989b) indicates that there is no superior grading technique
for homework and suggests using alternative feedback strategies such as
assigning grades and providing instructional or evaluative comments
without grades. Paschal, Weinstein & Walberg (1984) conclude that the
effects of homework are much stronger when the teacher grades or
comments on assignments. In Austin’s (1976) study, the effect of
feedback was related to regular teacher-constructed examinations of the
work concerned in the classroom and at home over six periods of study,
which supports the claim that teacher comments on homework papers might improve student achievement. The above intervention strategies involve a common
strategy of grading homework—providing feedback to homework.
However, these studies did not address explicitly teacher learning from
grading homework. Important questions such as, “Can grading homework
enhance teachers’ knowledge of students’ thinking?” and “How does
grading homework change teachers' ability to analyze errors?" have no
clear-cut answers. The inconsistency might be due to the generally poor
quality of research (Cooper, 1989b) and “the limited research on
homework” that results in findings of either the quality or the benefits
of homework (NRC, 2001, p. 426). In addition, Stiggins, Frisbie &
Griswold (1989) postulate three general reasons for common discrep-
ancies in grading homework. One of the reasons is that teachers lack
training or expertise in sound practices, which implies that teachers need
to be guided in finding an effective approach to grading homework. An’s
(2004) study confirms this need in the US and addresses concerns that US
teachers have less time for grading homework compared to their Chinese
counterparts, while Chinese teachers in the study who have sufficient time
to grade students’ homework not only grade each problem one by one but
also “record and analyze the mistakes in student work, thereby establish-
ing the patterns of errors” (p. 208). An’s study indicates that with
understanding of student misconceptions, 75% of Chinese teachers in the
study were able to plan their instruction according to their students’
needs, while only 7% of the US teachers did so. This finding challenges
researchers to find an effective approach to helping US teachers obtain
knowledge of student thinking from grading and analyzing homework,
thus assisting their instruction planning and teaching.

Teacher Learning of Students' Thinking from Analysis of Students' Homework
The reflective thought process allows teachers to learn and know
students’ thinking (Fennema, Carpenter & Lamon, 1991). One of the
approaches of the reflective thought process includes analysis of students’
work (An, 2004). Furthermore, through analyzing patterns of misconcep-
tion from homework, “teachers are able to adjust instruction according to
students’ needs, develop students’ understanding, and enhance the
effectiveness of their teaching” (An, 2004).
In a teacher learning and change study, Wu (2004) describes the need
for engaging teachers in the reflective thought process and shows that
teachers, in their professional development, would like to learn how to
analyze students' misconceptions. Some studies have also indicated that students' misconceptions can serve as catalysts in learning
mathematics (Empson & Junk, 2004) and as a springboard for inquiry
(Borasi, 1994). In Borasi’s (1994) error analysis study, 11 instructional
experiences for teachers were developed and analyzed. In these
instructional experiences, the teacher made a conscious effort to capitalize
on errors, as learning opportunities, by planning instruction activities
focusing on study of selected errors and by making impromptu use of
errors made by the students. A teacher who used knowledge of student
thinking “saw, within the child’s mistakes, productive, albeit informal,
thinking about division, and described an interaction that could build on
her thinking” (Empson & Junk, 2004, p. 138).
According to Fennema et al. (1991), teacher learning is influenced by
knowing children’s pre-instructional knowledge. To understand the
effects of new instruction, teachers need to first understand the nature
of pre-instructional knowledge and investigate how it influences what
children learn. In addition, “teachers must also be alert to the possibility
that a student’s actual knowledge may have ill-formed, fragile, or missing
concepts …" (Graeber, 1999, p. 192). One way of understanding pre-instructional knowledge and detecting such misconceptions is to analyze
students’ thinking by grading homework and to use it to link and
reinforce students’ learning from previous instruction (Cooper, 1989a).
The importance of analysis of students’ homework and providing
feedback from grading homework was evident in several studies. Early
work in the area done by Bandura (1977), Andrews & Debus (1978), and
Rumelhart (1980) indicates that analysis can influence self-efficacy and
persistence, lead to knowledge refinement, and initiate the correction of
misconceptions. An empirical study conducted by Gagne, Crutcher, Joella, Geisman, Hoffman, Schutz & Lizcano (1987) shows that the
provision of feedback following errors increases the probability of
learning. This finding supports the study by Schoen & Kreye (1974)
which found that feedback specific to the student’s errors has a positive
influence on students' scores. Consistent support for analyzing errors also comes from the findings of Borasi (1994), which concluded that "using errors as springboards for inquiry can offer mathematics teachers a viable instructional strategy to achieve many of the goals" (p. 200).
The current study explores an effective approach to building profound
pedagogical content knowledge of students’ thinking from grading
homework for a group of US teachers at fifth–eighth grade levels. In
the proposed study, homework is defined as a task that provides students
an opportunity to practice and reinforce their learned knowledge and
skills in order to be ready for new lessons; grading homework refers to a
teacher evaluating students’ understanding of their independent practice
and analyzing their patterns of misconceptions. This study proposes that
grading homework can become a more powerful tool for teaching if
teachers are provided with an opportunity to be trained to use an effective
way to carefully grade homework and analyze it for insights into how
students are making sense of instruction. Because of the limited time in a teaching day in the USA, which has shaped current homework grading practice, teachers do not have time to review and grade each problem during daily grading of homework. Most of the time, they glance
at their students’ work and give scores for completion or effort;
sometimes, they provide answers and have students self-grade or group
grade, which cannot provide teachers a complete picture of the effects of
their teaching on students’ learning. Considering this ineffective way of
grading, this study designs a sampling method to help classroom teachers
in grading homework and engaging in assessing and knowing their
students’ thinking of mathematics.

METHODOLOGY

Subjects
The participants were ten teachers from four schools (three of them middle schools) at the fifth to eighth grade levels in Los Angeles County, California, in the USA. All ten classes had quite diverse
student populations both linguistically and ethnically. In three middle
schools, more than 60% of students did not achieve mathematics
proficiency on the California Math Standards Test. To control confound-
ing variables, a matched design was used to match teachers in two groups
by the following variables: (a) having the same percentage of students
below proficiency level in mathematics, (b) having at least 3 years of
teaching experience at grade levels 5 through 8, (c) having at least 25%
second language learners, and (d) teaching the same content area at the
same grade level. Table 1 shows the demographic information of class
size, percentages of high, middle, and low students in each class at the
same grade level, including the teachers’ educational background and
teaching experience.

Procedures
Ten teachers were divided into two groups of five teachers, and each
group had teachers at different grade levels from fifth to eighth. The
researchers worked with school leaders to identify volunteer teachers and
match the teachers for the samples in the groups. Considering that US
teachers usually do not have sufficient time to grade all students’
homework daily, the teachers from the experimental group in this study
were trained in how to group their students into low, middle, and high
levels according to student achievement on California Standards Test
scores and how to select students from each group daily in order to have a
wide range of representation from different levels of students. They used
a two-step sampling method for grading student homework: (1) Analyze
students’ California Standards Test (CST) scores and district pre-tests to
divide students into three groups: “high” = above proficiency, “middle” =
proficiency, and “low” = below proficient scores on the CST; (2) select
one student each from the high and low groups and two students from
middle group randomly daily. Figure 1 shows a sample of grading records
from an eighth grade teacher in the experimental group. The first column
with names of students is deleted; column 2 recorded the scores of
students, from lowest score 8 to highest score 85. These scores from the
district pre-test at the beginning of the school year were the base for the teacher grouping into three levels.
TABLE 1
Demographic information of teachers and students in experiment and control groups

                        Class size              Education background     # of math units            Teaching experience (years)
                        Experiment  Control     Experiment  Control      Experiment  Control        Experiment  Control
Grade 5 (B1 and A1)     32          24          BA          BA           7           6              5           6
Grade 6 (B2 and A2)     34          35          BA          BS           Math major  9              4           3
Grade 7 (B3 and A3)     31          33          BA          BA           6           Math major     3           4
Grade 8 (B4 and A4)     32          32          BS          BS           7–26        7              9           8

Figure 1. The sample of grading records

Column 3 at the right end is instruction
by lesson number. Each day, the teacher selected four students from three
groups: one each from the high and low groups (see the shaded areas) and
two from the middle group (see unshaded area). For example, for lesson 4.1 (column 3), the teacher selected students with scores 9 (low), 30
(middle), 32 (middle), and 75 (high) to grade and analyze their
homework. The teacher took turns selecting students to be graded on
their homework, and each student would have an equal chance to be
selected during the semester. This grading is substantive: it allows teachers to select students equally from every range of learning ability and helps the teacher come to know students' thinking at different levels with a deep, internal understanding.
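As an illustration only (a hypothetical sketch, not code or a tool used in the study), the two-step sampling routine described above could look as follows in Python; the field cst_level and both function names are invented for this example:

    import random

    def group_students(students):
        """Split students into low/middle/high groups by CST proficiency.

        `students` is a list of dicts with the hypothetical keys 'name' and
        'cst_level', where 'cst_level' is 'below', 'proficient', or 'above',
        taken from CST scores and the district pre-test.
        """
        groups = {"low": [], "middle": [], "high": []}
        for s in students:
            if s["cst_level"] == "below":
                groups["low"].append(s)
            elif s["cst_level"] == "proficient":
                groups["middle"].append(s)
            else:
                groups["high"].append(s)
        return groups

    def daily_sample(groups):
        """Select one low, two middle, and one high student for today's grading."""
        return (
            random.sample(groups["low"], 1)
            + random.sample(groups["middle"], 2)
            + random.sample(groups["high"], 1)
        )

Rotating the selection over the semester, as the teacher in Figure 1 did, gives every student a roughly equal chance of being sampled.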
Teachers from the experimental group were also trained in the methods
of analyzing student error patterns. Each teacher randomly selected four
students from the three groups every day in order to grade their
homework, to analyze the errors in the homework, to make corrections
for misconceptions, to adjust their lessons according to students’
understanding as indicated from their homework, and to keep grading
logs (Table 2) in which they recorded and reflected on students’ thinking.
TABLE 2
Daily grading log for experimental group

Date: ________  Instructor: ________  School: ________  Grade: ______  Topic: ______________

                                Correcting errors                       Action of correcting error
Student     Errors    Reasons   Individual   Whole class   Other        Instruction   Activity   Assessment
Low
Medium 1
Medium 2
High

Note: If more space is needed, please write on a separate sheet of paper

Teachers from the control group were trained separately on how to record their normal ways of grading students' homework in the author-
designed grading logs weekly (Table 3) in two categories: individual
grading (collecting homework) and whole group grading (not collecting
homework).
Both groups of teachers were assessed using pre- and post-question-
naires in pedagogical content knowledge. Both groups of students were
assessed using a pre-test and post-test. Ten teachers in both groups were
observed once in each semester for two semesters, to identify how they
implement the results of grading homework in teaching. They were also
interviewed at the end of each semester to confirm and clarify their
responses on both the pre- and post-questionnaires and to find out the
differences between the two groups in their understanding of students’
thinking.

Instrumentation
The teachers’ pre-questionnaire included four pedagogical content
knowledge problems about fractions, decimals, and percents (An et al.,
2004; An, 2000, 2004); the post-questionnaire consisted of problems
similar to the pre-questionnaire. The pedagogical content knowledge
problems focus on teachers’ knowledge of students’ cognition and how to
promote the growth of students’ thinking. Pre- and post-tests for students
were both from the school district. The pre-tests were provided at the
beginning of the school year to identify students’ backgrounds in
mathematics, which supplied information for teachers in the experimental
group for grouping their students. To determine the effects of grading
homework on students’ learning, students were assessed at the end of the
school year with the post-tests provided by the school district.
The teachers’ daily grading log for the experimental group was
designed by the authors and discussed with teachers to help them develop
their knowledge of students’ thinking (see Table 2). It was used to collect
data on teachers’ growing ability in analyzing error patterns. The daily
grading log included four sections: (1) a short listing of students’ errors,
(2) brief reasons for misunderstanding, (3) decisions about whether to
correct misconceptions individually or for the group, and (4) how the
results would be used. In this fourth column, the teacher is asked to
indicate whether he or she will address the misunderstanding through
instruction, through activity, or through another assessment. The daily
grading log for the control group was designed to collect data on how
teachers normally grade homework (see Table 3) in either of two ways: if they had "individual grading," meaning they collected student homework, they needed to indicate whether they graded by effort, completion, or one by one for each problem; if they had "whole group grading," meaning they did not collect student work, they needed to tell whether they gave answers while students self-checked or peer corrected, or used other approaches.

TABLE 3
Weekly grading log for control group

Instructor: ________   Grade: ______   School: ________

                     Individual grading                        Whole group grading
                     (collecting homework)                     (not collecting homework; give answers)
Dates      Topic     Effort   Completion   One by one          Self-check   Group correcting   Other
Week 1
  Day 1
  Day 2
  Day 3
  Day 4
  Day 5
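To make the structure of the two instruments concrete, the following is a minimal, hypothetical sketch (not an instrument from the study) of how one entry of the experimental group's daily grading log, with the four sections shown in Table 2, might be represented:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DailyGradingEntry:
        """One row of the experimental group's daily grading log (see Table 2)."""
        student_level: str                                   # "low", "medium 1", "medium 2", or "high"
        errors: List[str] = field(default_factory=list)      # short listing of the student's errors
        reasons: List[str] = field(default_factory=list)     # brief reasons for the misunderstanding
        correction_scope: str = "individual"                 # "individual", "whole class", or "other"
        follow_up: List[str] = field(default_factory=list)   # "instruction", "activity", and/or "assessment"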

Data Collection and Data Analysis


Both qualitative and quantitative methods were used in this study.
Employing qualitative methodology (Guba & Lincoln, 1994), data were
collected via two assessments of the pre- and post-questionnaires for
teachers and the pre- and post-tests for students in the school year. In
addition, data were collected from 20 classroom observations and 20
interviews during the school year. Furthermore, data were gathered from
teachers’ daily grading logs spanning two semesters. Four teachers’ data
sets from both experimental and control groups were used due to a rain-
damaged classroom that affected a folder of one teacher from the
experimental group. To analyze the teachers’ growth of understanding of
student thinking, a scoring rubric was used which evaluated the
experimental group's error analysis (see Table 4).
Transcripts were made for these observations and interviews by
examining field notes and grading logs from teachers and reviewing the
responses to the assessments to gather and assess the data. A constant
comparative data analysis method (Lincoln & Guba, 1985) was used in the
data analysis for questionnaires, observations, interviews, and daily grading
logs. The responses from teachers in the questionnaires and interviews were
coded, grouped, categorized, and compared. To better identify teachers in
two groups, teachers in the control group were represented as teacher A1,
A2, A3, A4, and A5; teachers in the experimental group were identified as
teacher B1, B2, B3, B4, and B5 in the data analysis.
A quantitative method was applied to analyze students’ pre- and post-
tests. The assessments for students were compared according to their
scores in fraction, decimal, percent, and proportion areas. The analysis of
students’ achievement and its relation to grading homework is reported in
another article.

RESULTS

The results of this study show that the mathematics teachers used an approach of assessing and analyzing students' misconceptions to acquire knowledge of students' thinking. Grading homework and analyzing errors have enhanced teachers' knowledge of students' thinking and have built a strong relationship in teachers' pedagogical content knowledge between knowledge of students' thinking and analysis of errors.

TABLE 4
Rubric for evaluating error analysis

Criteria/levels                  1 = Does not meet expectations        2 = Meets some expectations                  3 = Meets expectations             4 = Exceeds expectations
Identify error                   Does not identify errors              Lists inappropriate errors                   Lists correct errors               Lists one or more errors
Analyze reason for error         Does not analyze reasons for errors   States inappropriate or incomplete reasons   Addresses procedural reasons       Addresses both procedural and conceptual reasons
Design approach for correction   Does not include method               Includes one method                          Includes two appropriate methods   Includes multiple appropriate methods
Action for correction            Does not include action for errors    Includes one method                          Includes two appropriate methods   Includes multiple appropriate methods

Approaches to Assessment and Analysis of Students’ Misconceptions


The daily grading logs used by both the experimental and control groups of teachers to record and analyze errors were analyzed. The different ways of grading and analyzing errors and the frequencies of the grading methods are summarized in Tables 5 and 6.
The analysis of the daily grading logs in the control group shows that
all teachers used a variety of ways to grade student homework (see
Table 5). However, the major approach of grading homework in the
control group was limited to assigning grades for effort, completion, or
giving answers for self-checks. Only two teachers in the control group
graded individual student work less than ten times during the whole
school year, and one teacher used a group grading approach. For
example, teacher A4 graded homework by student effort 15 times, by
completion 115 times, one-by-one grading nine times, and providing
answers to student self-checks 29 times.

TABLE 5
Frequency of daily grading log from the control group

Approaches to grading: Individual grading (collecting homework): Effort, Completion, One by one. Whole group grading (not collecting homework; give answers): Self-check, Group correcting, Other.

Teacher A1 (5th):   10   10    61
Teacher A2 (6th):   52   52    43
Teacher A3 (6th):   76   76
Teacher A4 (7th):   15   115   9    29
Teacher A5 (8th):   25   6     11   17

Note: The numbers in the table are frequencies of grading homework

TABLE 6
Frequency of daily grading log for error analysis for the experimental group

Columns: # of errors analyzed; # of correcting errors (Individual, Whole group, Other); action of correcting errors, i.e., use of the results of error analysis (Lesson, Activity, Assessment).

Teacher B1 (5th):            154   12   43   43   5    8
Teacher B2 (6th):             86    7   19   14   11
Teacher B3 (6th):            110   56   42   46   9    17
Teacher B4 (7th and 8th):    105   99   34   38   7    34
Teacher B5 (7th):             97   38   56   24   30   15

The numbers in the table are the frequencies of ways of grading

Table 6 shows the data in daily grading logs from the experimental
teachers. In contrast to the control group, all teachers from the
experimental group not only graded samples of individual student
work but also analyzed students’ misconceptions from grading
homework daily and used the results of error analysis to plan the
next day’s instruction.
The results in Table 6 show that for the daily grading of the samples of
student homework, the teachers in the experimental group were able to
conduct daily error analysis more than 80 times during the study, with
four of them correcting errors frequently for the whole group and
integrating error correction in their instruction so the whole class could
benefit from it. For example, teacher B3 was able to correct individual errors face to face more than 50 times, and teacher B4 made individual corrections more than 90 times. In addition, the teachers in the experimental group
not only used results of error analysis in assessments to reinforce
understanding but also used error analysis in the lessons and activities to
enhance students’ learning from errors. For example, teacher B5 used the
results of correcting errors 30 times in activities, and teacher B3 used the
results of correcting errors 46 times in the instruction; teacher B4 used the
results of correcting errors 34 times in assessment to find the effects of
correcting errors. It is interesting to note that teachers at upper grade
levels were able to provide frequent help to individual students to correct
their errors.

Enhancing Knowledge of Students' Thinking by Grading Homework and Error Analysis
The results of the teachers’ error analysis show that through error analysis,
the teachers from the experimental group made obvious progress in their
knowledge of students’ thinking and understood the difficulties and
challenges their students had in learning mathematics from the processes
of the four steps of identifying errors, analyzing reasons for the errors,
designing approaches for correction, and taking action for correction. Table 7
shows the average scores of each step in fall and spring semesters, which
exhibit the progression of teachers’ knowledge of students’ thinking in the
course of two semesters of grading homework and analyzing errors. In
addition, the reduced standard error in the spring semester shows that the scores for the four steps of error analysis were more concentrated and less spread out in spring than in fall.
Figure 2 shows that all teachers improved their knowledge of student
thinking from error analysis. Among the four areas of identifying error,
analyzing reason for error, designing approach for correction, and taking
action for correction, the teachers improved the most in their skills of
analyzing reasons for errors, followed by taking action for correction.
Table 8 shows the breakdown of scores by teachers B1 to B4 in two
semesters and provides evidence of their growth from engaging in
grading and analyzing errors in the experimental group.

Improvement of Error Analysis from the Experimental Teachers


The following examples highlight four teachers' analyses from the experimental group at different score levels in specific content areas.

TABLE 7
Scores of teachers' knowledge of error analysis

            Fall semester                  Spring semester
Criteria    Mean      Standard error       Mean      Standard error
Identify    2.875     0.187                3.650     0.071
Analyze     3.025     0.187                3.750     0.071
Design      2.633     0.215                3.467     0.081
Action      3.220     0.167                3.740     0.063
[Figure 2 (bar chart): mean scores, on a 0 to 4 scale, for Identify, Analyze, Design, and Action in the fall (TBFall) and spring (TBSpring) semesters]

Figure 2. Overall mean scores of teachers’ knowledge of error analysis in two semesters

These examples show that teachers are learners who catch students’
errors, inquire about students’ thinking, analyze patterns of errors, take
action to correct, use errors to reinforce understanding, and learn from
error analysis on a daily basis in a progressive process as the content level
increased from fall semester to spring semester. Their engagement in the
inquiry process of error analysis provided the evidence of their progress
and growth in knowledge of students’ thinking.

TEACHER B1: FROM LEVEL 1 TO LEVEL 4 ON ANALYZING ERRORS FOR FRACTIONS

Teacher B1, a fifth grade teacher with 5 years of teaching experience, did
the following error analysis for fractions. The examples of error analysis
show that teacher B1’s knowledge of students’ thinking has been
enhanced from scores 1 to 4 as more analysis work has been done.

Example of score 1: Add or subtract: 4/5 − 4/5.

1. Analysis of Cathy's work:

Error: 4/5 − 4/5 = 5.
Reason: I do not understand error.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and assessment.

2. Analysis of Jack's work:

Error: 4/5 − 4/5 = 1/5.
Reason: Found 4 − 4 = 0. I am not sure why.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and assessment.
TABLE 8
Mean scores of individual teachers' knowledge of error analysis in two semesters

            Teacher B1           Teacher B2           Teacher B3           Teacher B4
Criteria    Fall     Spring      Fall     Spring      Fall     Spring      Fall     Spring
Identify    2.9      3.8         3.1      3.9         2.5      3.5         3.4      3.9
Analyze     2.7      3.6         3.0      3.8         2.4      3.3         3.2      3.8
Design      3.1      3.7         3.3      3.7         3.0      3.6         3.5      3.8
Action      2.8      3.5         2.7      3.6         2.3      3.5         3.7      3.7

Teacher B1 got a score of 1 for analyzing errors in the problem 4/5 − 4/5 for Cathy and Jack's work. First, she "did not understand the error" from Cathy's work: 4/5 − 4/5 = 5. Although she thought that Jack might have calculated 4 − 4 = 0, she "was not sure why." Teacher B1 did not know that these two students did not understand the concept of fractions, that they had difficulty seeing a fraction as a number with the same properties in subtraction as a whole number, and that they were struggling with "zero" answers. For Cathy, she seemed to think that when subtracting fractions with a common denominator, the denominator stays the same, so when the top parts give 4 − 4 = 0, there is only the denominator 5 left; therefore, 4/5 − 4/5 = 5. For Jack, he had the same difficulty in dealing with zero in the numerator and thus did not know how to put 0 in the fraction, producing the wrong solution 4/5 − 4/5 = 1/5. Teachers with rich knowledge of student thinking would know that if these two students understood that the meaning of 4/5 − 4/5 is taking away four parts of a whole from four parts of a whole, they would get 0/5 = 0.
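For reference, the correct computation that this reasoning points to can be written out as follows (a restatement for the reader; it is not part of teacher B1's log):

\[
\frac{4}{5} - \frac{4}{5} = \frac{4-4}{5} = \frac{0}{5} = 0,
\qquad \text{whereas Cathy wrote } \frac{4}{5} - \frac{4}{5} = 5
\text{ and Jack wrote } \frac{4}{5} - \frac{4}{5} = \frac{1}{5}.
\]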
Example of score 2: Multiply or divide to find the missing number: 9/12 = 3/__.

1. Error: 9/12 = 3/21.
2. Reason: I think the student guessed by adding 9 and 12.
3. Correction: Correct this misconception with the student.
4. Action: Address this misconception in instruction.

Teacher B1 identified the possible error as adding the numerator 9 to the denominator 12 to get 21 for the missing number. However, she did not consider another possible misconception: the student may have followed the instruction to divide 9 and 12 by the same number, 3, resulting in 9/12 = 3/4. From that, the student might have multiplied 3 and 4 to get 12 and 3 × 3 to get 9, and then added 12 to 9 to get 21. Teacher B1 realized it was an individual error, so she decided to correct this misconception with the student and address it in instruction.
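A worked solution of the missing-number problem, added here for reference (the correct value is not stated in the log):

\[
\frac{9}{12} = \frac{9 \div 3}{12 \div 3} = \frac{3}{4},
\qquad \text{so the missing number is } 4.
\]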

Example of score 2: Add or subtract: 3 17/25 − 2 3/25.

Errors: 3 17/25 − 2 3/25 = 1 19/25.
Reasons: Added instead of subtracted.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and assessment.

Teacher B1 got a score of 2 for analyzing the reason for the error because she only indicated one aspect of the error, that the student "added instead of subtracted," and ignored the incorrect result of adding 17 and 3 to get 19. In addition, she did not point out whether the adding was done with the fraction parts or the whole number parts. In fact, the student subtracted the whole numbers, 3 − 2 = 1, correctly. For analysis of mixed numbers, teachers should analyze both the fraction and the whole number parts and should require accuracy in the results. Teacher B1 corrected this error with the whole class and addressed this misconception in instruction and assessment.
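For completeness, the correct result (not given in the log) is:

\[
3\tfrac{17}{25} - 2\tfrac{3}{25} = (3 - 2) + \frac{17 - 3}{25} = 1\tfrac{14}{25}.
\]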

Example of score 3: Add or subtract: 1 1/2 + 1/2.

Errors: 1 1/2 + 1/2 = 1 2/2.
Reasons: Computation error. Did not find that 2/2 = 1 whole.
Correction: Correct this misconception with the student.
Action: Address this misconception in instruction.

Teacher B1 got a score of 3 for this error analysis. She identified the error correctly as "did not find 2/2 = 1 whole." However, she could have gone further to consider that the student might not understand that a mixed number consists of a whole part and a fraction part, meaning 1 2/2 = 1 + 2/2 = 1 + 1 = 2. Teacher B1 corrected this error with the student and addressed this misconception in instruction.
In the spring semester, when scoring students' work from the review for the end-of-course exam, teacher B1 analyzed a problem similar to the example of score 2. This problem was harder because it did not have the instruction "multiply or divide to find the missing number" as the previous one did. Students needed to figure out a way to find the missing number in
the following example:
Example of score 4: Find the missing number to make the fractions equivalent: 6/9 = __/18.

Errors: 6/9 = 54/18.
Reasons: Student multiplied the numerator and denominator to find the unknown value. Did not understand the concept of equivalence.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and activity.

Teacher B1 got a score of 4 for analyzing reasons for the error. Teacher
B1 realized that the student simply put 54 as the answer since that is the
product of numerator 6 and denominator 9 of the first fraction. She
identified the misconception as being with the student’s misunderstanding
of the concept of equivalence. When finding the missing number to make
the fractions equivalent, one should divide 18 by 9 to get the factor 2 and, at the same time, multiply 6 by that 2 in order to have equivalent fractions: 6/9 = (6 × 2)/(9 × 2) = 12/18. To correct this misconception and help students understand how to create equivalent fractions, teacher B1 decided to bring it up with the whole class and to integrate this in
instruction and activity because “concepts such as the equivalence of
rational numbers develop very slowly over time” (Ashlock, 2006, p. 146).

TEACHER B2: FROM LEVEL 1 TO LEVEL 4 ON ANALYZING ERRORS FOR INTEGERS

Teacher B2, a sixth grade teacher with 4 years of teaching experience,


grew gradually in knowledge of student thinking from analyzing errors in
student work. The following analysis shows teacher B2’s progress in
analyzing errors in integers:

Example of score 1: Compute: (20 + 10) + (−6).

Errors: (20 + 10) + (−6) = 30 + −6 = −24.
Reasons: Computation error. 30 + −6 as −24.
Correction: Remediation after school.
Action: Address this misconception in warm-up activity.

Teacher B2 only recognized this error as a computational error and did not think about the misconceptions in learning integer operations. The student might get the correct answer 24 first, as if it were whole number subtraction, but since this problem was related to integer operations, the student added a negative sign to the result. Teacher B2 corrected this error with the student after school and addressed this misconception in the warm-up activity.

Example of score 2: Compute: −28÷0.

Errors: −28 ÷ 0 = 0.
Reasons: Did not fully understand that multiplication is the inverse
operation of division?
Correction: Correct this misconception with the whole class: When 0 is
the divisor, the quotient is undefined.
Action: Address this misconception in activity: Series of integer division
problems.

Teacher B2 had a score of 2 for the analysis of this problem because he addressed an inappropriate error; it is an unnecessary step. In his view, a correct way of solving this problem should change dividing by 0 to multiplying by 1/0: −28 ÷ 0 = −28 × 1/0. All four sampled students had the same error. Teacher B2 did not see that his students did not fully understand the concept of division. Even with this added step, students might still make the same error if they do not understand the meaning of 0 as a divisor. The concept of division helps students understand that −28 ÷ 0 means using "an extremely small part," like 0, to find out how many such extremely small parts are in −28. The number line model provides a meaningful way for students to see the infinite number of such extremely small parts; therefore, the answer is undefined. Although teacher B2 told the whole class that "when 0 is the divisor, the quotient is undefined" during the correction process, a visual representation should be used to help students understand this type of problem conceptually.
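A brief supplement, phrased in the inverse-operation terms the teacher mentioned (this argument is added here and is not from the study):

\[
\text{If } -28 \div 0 = q, \text{ then } q \times 0 = -28;
\text{ but } q \times 0 = 0 \neq -28 \text{ for every } q,
\text{ so no such quotient exists and } -28 \div 0 \text{ is undefined.}
\]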

Example of score 3: Write a variable expression for the verbal phrase.


Twice a number increased by seven.

Errors: x² + 7
Reasons: Does not understand the meaning of exponents.
Correction: Correct this misconception with the whole class: review of
exponents.
Action: Address this misconception in instruction.
Teacher B2 had a score of 3 for analyzing this error. He identified the
error as not understanding the meaning of exponents. However, teacher B2 did not indicate the possible difficulty with the word "twice." Here, the student was confused about "twice a number": it means doubling, not an exponent. Teacher B2 corrected this misconception with the whole class and addressed it in instruction.
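For reference, the intended translation of the phrase (a restatement, not from the log) is:

\[
\text{``twice a number, increased by seven''} \;\Longrightarrow\; 2x + 7,
\qquad \text{not } x^{2} + 7.
\]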

Example of score 4: Complete the statement with add, subtract, multiply, or divide, explain your choice of operation:
This year the rainfall is 1.5 inches less than last year. To find out how
much rain fell last year, you_________

Errors: “Subtract because you find out what is less you subtract.”
Reasons: Misunderstanding of the situation and misunderstanding of how
to use key terms.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and assessment.

Teacher B2 knew that the student did not understand the question and thus did not know how to use a key term such as "less than" in a different situation. Usually, some teachers tell students to subtract if they see the words "less than" and to add if they see the words "more than." If students learn these words without understanding, they will ignore the context of the situation and use these words as a general rule for integer operations. To correct students' misconception in using these key terms in integer applications, teacher B2 made the correction for the whole class and used it in both instruction and assessment.
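For reference, one reading of the item (the correct choice is not spelled out in the log): since this year's rainfall is 1.5 inches less than last year's,

\[
\text{this year} = \text{last year} - 1.5
\;\Longrightarrow\;
\text{last year} = \text{this year} + 1.5,
\]

so the correct completion is "add."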

TEACHER B3: FROM LEVEL 2 TO LEVEL 4 ON ANALYZING ERRORS FOR PERCENTS

Teacher B3, a seventh grade teacher with 3 years of teaching experience, conducted the following error analysis on percents during the experimental period:

Example of score 2: Write the decimal as a percent: 0.07.

Errors: 0.0007%.
Reasons: Using the shortcut for conversion improperly.
Correction: Correct this misconception with the whole class: review of
shortcut.
Action: Address this misconception in instruction.

Teacher B3 got a score of 2 for analyzing this error because he did not
provide the complete reasons for the error. One reason teacher B3 concluded
is that the student used “the shortcut for conversion improperly.” The
shortcut is the computational technique that tells students to move decimal
points two places to the right if changing a decimal to a percent, such as
0.07 = 7%. However, teacher B3 was not aware that the student might not understand the meaning of a decimal and its relationship with a percent. The decimal 0.07 means seven hundredths, i.e., seven squares in a hundred-square decimal grid, which is equivalent to 7/100, that is, 7%.

Example of score 3: Write the fraction 91/50 as a percent.


Errors: 91/50=1.82%.
Reasons: Left decimal in percentage.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in activity.
Teacher B3 got a score of 3 for analyzing this error because he pointed out the procedural error correctly. He understood that the student did not move the decimal point to the right when changing the fraction 91/50 to 1.82, which was then written as the percent 1.82%. However, he did not analyze the possible misconception in the relationship between a fraction and a percent, i.e., changing 91/50 to 182/100 first, and then moving the decimal point left two places from 182% to 1.82%. The error also might have come from the student's poor number sense for percents. With good number sense about rational numbers, the student would see that 91 > 50, so 91/50 should be more than a whole, that is, 100%. By understanding that the result is more than 100%, the student would get the correct solution: 91/50 = 182%. Considering that it was a common error and an important number skill, teacher B3 made the correction for the whole class and used it in the activity.

Example of score 4: Use proportion to answer the question: The news media reported that 6,435 votes were cast in the last election and that this
represented 65% of the eligible votes of that district. How many eligible
votes are in the district?
Errors: 65% of 6,435 = (0.65)(6,435) = 4,182.76; 4,182.76 + 6,435 = 10,617.76.
Reasons: Looking for 65% of 6,435. She did not understand the question. Should use this instead: 65/100 = 6,435/x.
Correction: Correct this misconception with the whole class.
Action: Address this misconception in instruction and assessment.
Teacher B3 got a score of 4 for analyzing this error. He identified the error as coming from the student not understanding the question, which resulted in looking for 65% of 6,435. The question asked for the total number of votes that represents 100% of the eligible votes; however, 6,435 is the number of votes cast in the last election, which is only 65% of the total eligible votes. Teacher B3 suggested using the proportion instead: 65/100 = 6,435/x, which is a good way to solve percent-related problems conceptually. Teacher B3 made this correction for the whole class and used it in instruction and assessment.
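For completeness, solving the suggested proportion gives (the final value is not stated in the log):

\[
\frac{65}{100} = \frac{6{,}435}{x}
\;\Longrightarrow\; 65x = 643{,}500
\;\Longrightarrow\; x = 9{,}900 \text{ eligible votes.}
\]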

TEACHER B4: FROM LEVEL 2 TO LEVEL 4 ON ANALYZING ERRORS FOR POWER OF PRODUCTS

Teacher B4, a seventh and eighth grade teacher with 9 years of teaching
experience, conducted the following error analysis during the experimen-
tal period:

Example of score 2: Simplifying expression: 25 − (3 · 2)².

Errors: 25 − (3 · 2)² = 25 − (3 · 4).
Reasons: Does not complete operations within parentheses first.
Correction: Correct this misconception with the individual and the whole
class.
Action: Address this misconception in instruction and assessment.

Teacher B4 had a score of 2 for analyzing the errors for this problem because he believed that the student did not complete operations within parentheses first. However, teacher B4 did not accurately identify the student's misconception, a misuse of the power of a product rule in which only the factor 2 in the parentheses is raised to the power of 2. For this problem, it is not necessary to do the operation within the parentheses first. The problem can be solved in this way: 25 − (3 · 2)² = 25 − (3)²(2)² = 25 − 9 · 4. In addition, the student might think that the exponent 2 applies only to the 2 in the parentheses because 2 is the closest number to the exponent, so 25 − (3 · 2)² = 25 − 3 · 2² = 25 − (3 · 4). Teacher B4 recognized this error pattern as involving a critical skill in learning early algebra; he made a decision to correct it for both individuals and the whole class. To review and reinforce this skill, teacher B4 added it to his instruction and assessment.
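For reference, the full evaluation of the expression (the final value is not stated above) is:

\[
25 - (3 \cdot 2)^{2} = 25 - 6^{2} = 25 - 36 = -11.
\]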

Example of score 3: Evaluate (−5)³.

Errors: (−5)³ = 5 · 5 · 5 = 125.
Reasons: Left out negative signs. Did not understand that (−5)³ = −5 · −5 · −5 = −125.
Correction: Correct the error with the individual student.
Action: Address this misconception in instruction, activity, and assessment.

Teacher B4 received a score of 3 for this analysis. He indicated a reason for the error relating to "left out negative signs." In addition, he also found that the student "did not understand (−5)³ = −5 · −5 · −5 = −125." But teacher B4 did not indicate that the student also did not understand the concept of the power of products. In (−5)³, −5 is the base, and (−5)³ means −5 to the third power. To evaluate (−5)³, the student needs to write the base −5 in a product 3 times and then multiply. Teacher B4 knew that powers involving integers are very important for students' continued learning, so he corrected the error individually and incorporated it into instruction, activity, and assessment.

Example of score 4: Simplify: (2c/7d)².

Error: (2c/7d)² = 2²/7² = 4/49.
Reasons: Understood concept in relation to numbers, but not variables. Left out variables. Did not raise variables to specified power.
Correction: Correct the error with the individual student and whole class correction.
Action: Address it in instruction.

Teacher B4 had a score of 4 for analyzing this error. He recognized that the student understood the concept of a power in relation to numbers because the student knew that (2/7)² = 2²/7² = 4/49; however, when the power has numbers and variables together, the student did not raise the variables to the specified power and left out the variables. This common error reflects the disconnection between numbers and variables and shows students' difficulties in transferring learning from numbers to abstract algebra at the beginning of learning algebra. To help students better understand powers related to variables, teacher B4 made a correction for both individuals and the whole class and used it in instruction.
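For reference, the correct simplification (not given in the log) is:

\[
\left( \frac{2c}{7d} \right)^{2} = \frac{(2c)^{2}}{(7d)^{2}} = \frac{4c^{2}}{49d^{2}}.
\]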

The above examples of error analysis show that the teachers’ levels of
error analysis moved from an improper or lower level to an appropriate high
level as they conducted more grading and analyzed students’ work
throughout the course of the project. The teachers were able to identify the
appropriate reasons for errors and provide multiple strategies that included
correcting misconceptions for individuals and the whole class and addressing
these results in their instruction, activity, and assessment so their students
would be aware of these errors and learn from them. The process of grading,
analyzing, correcting, and using errors provided opportunities for the
teachers to understand different types of student thinking on specific content
areas and to understand difficulties and challenges students might face in
learning mathematics. As the result of engaging in this learning process, the
teachers enhanced their knowledge of student thinking, evidenced by the
above analysis, and from the discussion of their PCK in their surveys below.

Effects of Grading Homework on the Growth of Pedagogical Content Knowledge
The results from comparing teachers’ responses on post-questionnaires show
that the teachers in the experimental group were able to identify students’
errors and make corrections for misconceptions using various strategies such
as pictures or tables, compared to their responses in the pre-questionnaires.
Furthermore, the teachers in the experimental group were able to provide
questions that focused on promoting and supporting students’ thinking in
their post-questionnaires, compared to their responses in the pre-question-
naires. The following are samples of teachers' responses to the four PCK problems in the pre- and post-questionnaires from the experimental group (see "APPENDIX": four PCK problems).

Problem 1. Teacher B2 answered problem 1, part b in the pre-questionnaires:

Pre-Questionnaire: I am not quite sure of what I would ask him. I suppose that I would
have him draw fraction models. We would go over how the addition of fractions can be
taught as the addition of the area of the models. We would then move this discussion
toward common denominators.

After engaging in grading students' homework, teacher B2 answered in the post-questionnaire:

Post-Questionnaire: I would have George draw a picture of 3/5, 1/2, and 2/3. I would
then ask him if it looked possible for the area of 2/3 to be the results of removing 1/2
from 3/5.
The second response from teacher B2 appeared to be more detailed and specific in helping George correct misconceptions about fraction subtraction with uncommon denominators.
Problem 2. In the pre-questionnaire, teacher B3 created the list of
fractions (1/4, 1/3, 1/8) with the same numerator and had Latoya put it in
order. By asking probing questions, such as “Why did you order them in
this way?,” teacher B3 believed that he could determine Latoya’s thought
process. In the post-questionnaire, teacher B3 developed a finer way to
access students’ thinking by using 1/4 as a reference number; teacher B3
stated it in the following:
I would ask Linda to add ¼ to the fractions and put it in its proper location. If she thinks
large fractions have large denominators, she will put 1/4 between 4/5 & 2/3–5/11, 4/5, 1/4,
2/3. If she puts 1/4 last—5/11, 4/5, 2/3, 1/4, then she is using the numerators to compare
the fractions.

Problem 3. The following shows teacher B1's growth of knowledge of students' thinking in problem 3, part (a) between the pre- and
post-questionnaires:

Pre-Questionnaire: Give them a scenario, for example: Your sister's birthday was last weekend. 1/3 of her birthday cake was left over. Your mom tells you to divide the leftover into fourths and take each piece to one of the neighbors.
Post-questionnaire:

1. Give students a strip of construction paper and have them fold it in half.
2. Have students shade half.
3. Unfold paper and fold again, but into thirds. Trace over the fold lines.
4. Students count the number of thirds in 1/2. They will count 1 1/2 thirds in 1/2.

In the pre-questionnaire, teacher B1 connected multiplication of fractions to real-life experience. However, teacher B1 did not address how to interpret 1/3 × 1/4 = 1/12 using this introductory activity. In addition, teacher B1 did not directly address the original problem 3/4 × 2/3. In contrast, teacher B1 used a meaningful introductory activity
of paper folding to engage students in and motivate them to learn
fraction division with understanding in the post-questionnaire (see
Figure 3).
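As a supplement consistent with step 4 of the folding activity (this computation is added here, not taken from the questionnaire), the division modeled by the paper folding is:

\[
\frac{1}{2} \div \frac{1}{3} = \frac{1}{2} \times \frac{3}{1} = \frac{3}{2} = 1\tfrac{1}{2},
\]

that is, there are 1 1/2 thirds in one half.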
Figure 3. Fraction division: 1/2 ÷ 1/3

Problem 4. Teacher B4 recognized in the pre-questionnaire, "In the past, when I taught proportions, I did not use any activity. For the most part, we would go over the plug and chug method (cross multiplication to solve for the variable)." In the post-questionnaire, teacher B4 developed a group activity: "I would have students try to solve the problem together in
partners. This is before I even lecture. After they are done, each group
would present their method of solving.” Furthermore, he indicated how to
help Larry avoid errors:

For Larry, I would have him label the numbers in each ratio. This might help him
determine how to set up each ratio in the Proportion. Right now, he is using the right
numbers but they are not set up properly.

Labeling each part in setting up a proportion is recognized as an effective visual method for solving a proportion problem (An, 2000, 2004).
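A worked illustration of the labeling method for Sara's problem (our sketch, not teacher B4's written response) places the percent ratio against the dollar ratio:

\[
\frac{\text{percent spent}}{\text{percent of total}} = \frac{\text{dollars spent}}{\text{dollars earned}}
\quad\Rightarrow\quad
\frac{40}{100} = \frac{80}{x}
\quad\Rightarrow\quad
40x = 8000,\; x = \$200.
\]

Larry's setup of 40/80 = x/100 mixes the two quantities, which is exactly what labeling each number before placing it in the proportion is meant to prevent.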
To encourage students to reflect on their answer and solution, teacher
B2 pointed out the strategy of looking back:

To teach them the 4 steps in problem solving. Understand the question, pick a strategy, solve,
and look back. I would emphasize looking back by giving students solutions to problems that
may or may not be correct. They would be asked to analyze these solutions.

In summary, the results in this study show that by grading homework and
analyzing errors, teachers not only gained in their understanding of
students’ thinking of mathematics but also enhanced their PCK.

DISCUSSION

Approaches to Acquiring Knowledge of Students’ Thinking


The results of this study indicate that analyzing errors from grading
homework is an important approach to acquiring knowledge of students’

thinking. It shows that teachers can provide more and better feedback to
students when they use the sampling strategy and Daily Grading Log
described in this study to analyze student work. This approach can ease the
common concern that there is "no time" to grade homework in the USA. The
evidence from this study shows that, by grading four selected samples of
homework daily across three achievement levels, teachers are able to
identify students' understanding on a daily basis. In addition, by doing
homework, students engage in an inquiry process of recalling, reviewing,
and understanding the concepts and skills learned in the classroom and
applying them to solve problems independently (Butler, 2001). In this
process, students internalize knowledge and skills, transferring them from
external sources into their own understanding. Once students have the
“ownership” of the knowledge, their homework reflects their thinking of
mathematics at an internal level. Therefore, grading students’ homework
also provides teachers an opportunity to acquire knowledge of students’
thinking at a deeper level.
Although “it is critical for students to complete homework to
internalize and make their solution/strategy personalized (your
ownership)” (teacher B4), homework must be monitored and
followed up on (NRC, 2001; Butler, 2001). Through grading
homework, teachers monitor and follow up on students’ understand-
ing, which improves the effectiveness of their teaching. Grading
homework allows teachers to review and solidify material being
covered in class (Corno & Xu, 2004), which helps teachers make
instructional decisions that take students' needs into account based on
the analysis of errors in students' homework. In
addition, it allows teachers to focus their instruction and
develop better ways to teach effectively (Borasi, 1994). Further, it
also helps students master knowledge with understanding. The
results of this study provide strong evidence that grading homework
and analyzing errors indeed enhanced teachers’ learning of students’
thinking.

The Effects of Analyzing Errors on Teachers' Learning of Students' Thinking
The results of this study provide strong evidence that analyzing
errors plays a vital role in successful teacher learning and
understanding of students’ thinking. According to Ashlock (1994),
“Errors are a positive thing in the process of learning—or at least
they should be. In many cultures, errors are regarded as an

opportunity to reflect and learn" (p. 1) and as springboards for inquiry
(Borasi, 1994). Through error analysis in this study, teachers
were able to progress in their knowledge of students’ thinking and
found patterns of errors. For example, teacher B3 identified the error
of 2 3/4 × 1 1/5 = 2 3/20: "Did not change mixed numbers into improper
fraction." He decided to make a correction with the whole class and
integrate it in instruction, activity, and assessment to reinforce the
correct way of solving this type of problem. Research from the NRC
(2001) supports the statement “effective instruction with rational
numbers needs to take these errors into account” (p. 238). As shown
in their grading logs, all teachers from the experimental group in
this study were able to integrate error patterns in their teaching
practices.
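Assuming the problem is the mixed-number multiplication reconstructed above, the correction teacher B3 planned amounts to the following worked steps:

\[
2\frac{3}{4} \times 1\frac{1}{5} = \frac{11}{4} \times \frac{6}{5} = \frac{66}{20} = 3\frac{3}{10},
\]

rather than multiplying the whole-number and fraction parts separately (2 × 1 = 2 and 3/4 × 1/5 = 3/20), which produces the erroneous 2 3/20.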
The process of error analysis could contribute to progress in
developing efficient teaching strategies if time is spent examining
how and why students make mistakes and in making corrections for
the errors immediately (Pajares, 2002). Findings from this study
show that such analysis can promote understanding of students’
thinking by revealing the error patterns and can enhance teachers’
knowledge of students’ thinking. In addition, the process of error
analysis not only facilitates teachers’ understanding of students’
thinking but also helps students in active learning. It serves to nip
students' "blind spots" in the bud. Students can avoid making
mistakes if the teacher can help them realize and eliminate errors at
an early stage (An, 2004), so students will not repeat the errors.
With less repetition of errors, students will firmly master the
concepts and skills at a proficient level. Furthermore, the process
of error analysis provides a clear mirror through which teachers are
able to reflect on and evaluate their teaching effectiveness, and this
reflection allows teachers to identify their strengths and weaknesses
in teaching specific content areas and build their experience and
confidence in their teaching.

The Impact of Grading Homework on the Growth of Pedagogical Content Knowledge
The important features of PCK are understanding the level of
difficulty of a particular concept and the prior knowledge students
usually bring to that topic (Shulman, 1986) and knowing students’
thinking (An et al., 2004). To teach mathematics effectively, a
teacher should acquire a solid PCK. This study explored and confirmed

one of the gateways to acquiring PCK: grading homework, through which
teachers identify errors, analyze errors, make corrections for misconceptions,
and decide how to integrate error analysis in their instruction, activity, and
assessment. By engaging in this inquiry process, teachers transfer
their content and pedagogical knowledge into a powerful pedagogical
content knowledge, which focuses on students’ thinking of mathe-
matics (An, 2000). This type of knowledge is difficult to obtain from
college courses or professional development; it can be acquired
meaningfully through daily teaching practice. According to Corno
(2000), grading homework gives teachers a sense of how well their
students understand the lesson and where the problems are; in
particular, it provides opportunities for teachers to tailor their
feedback to individual errors. It takes time and effort to accumulate
and solidify knowledge of students’ thinking. The evidence from this
study shows that grading homework is one important avenue to gain
knowledge of students’ thinking, an important part of PCK.

Implementing the New Model of Grading Homework in Daily Teaching


This study shows that, to understand students' thinking at a deep,
internal level, teachers should grade students' homework and analyze
the errors in it. The most important contribution
from this study was that it provided a sample of a new model of
grading homework that is feasible and practical for US classroom
teachers. This model of grading student homework helps to solve the
dilemma in grading homework practice in the US. Considering that US
teachers do not have sufficient time to grade all students' homework
daily, this study suggests grouping students into three levels of low,
middle, and high and selecting one or two students’ homework from
each group to grade on a daily basis. In addition, teachers should
engage in an error analysis process daily by identifying error patterns
and analyzing the possible reasons for the misconception. To have
students master the concepts and skills correctly and accurately,
teachers must provide feedback immediately by making corrections
for errors, which is also a key aspect to improving self-efficacy
(Pajares, 2002). Furthermore, to help students reinforce their
knowledge and achieve proficiency in mathematics, teachers must
assign homework that strengthens the targeted skills and knowledge
(Coutts, 2004) and then integrate the error patterns in instruction,
activity, and assessment to enable students to learn from errors
(Empson & Junk, 2004; An, 2004).
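For teachers who keep an electronic class roster, the sampling-and-logging routine described in this section can be sketched in a short script. The following is only an illustrative sketch under our own assumptions; the class names, field names, and random sampling are hypothetical and are not materials from the study.

import random
from dataclasses import dataclass

@dataclass
class GradingLogEntry:
    """One Daily Grading Log record following the four-step inquiry process."""
    student: str
    error_identified: str      # Step 1: identify the error
    possible_reason: str       # Step 2: analyze the reason for the error
    correction_approach: str   # Step 3: design an approach for correction
    action_taken: str          # Step 4: take action (instruction, activity, assessment)

def select_daily_samples(groups, per_group=1):
    """Pick one or two students' homework from each of the low, middle,
    and high groups to grade today (the three-level sampling strategy)."""
    samples = []
    for level in ("low", "middle", "high"):
        students = groups.get(level, [])
        samples.extend(random.sample(students, min(per_group, len(students))))
    return samples

# Hypothetical roster grouped by achievement level.
roster = {
    "low": ["Ana", "Ben"],
    "middle": ["Cara", "Dev", "Eli"],
    "high": ["Fay", "Gus"],
}
todays_samples = select_daily_samples(roster, per_group=2)

# After grading, each analyzed error becomes one log entry.
log = [
    GradingLogEntry(
        student=todays_samples[0],
        error_identified="2 3/4 x 1 1/5 = 2 3/20",
        possible_reason="Did not change mixed numbers into improper fractions",
        correction_approach="Whole-class correction using improper-fraction conversion",
        action_taken="Integrate the error pattern into instruction, activity, and assessment",
    )
]

Recording the four steps for each sampled student keeps the daily grading workload bounded while still producing a running record of error patterns over time.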

EDUCATIONAL SIGNIFICANCE

This study tested the proposition that grading homework can become
a more powerful tool for teaching if teachers are trained in an
effective way to carefully grade
homework and analyze it for insights into how students are making
sense of instruction. Homework plays an increasingly vital role in
improving student achievement (Cooper & Valentine, 2001). However,
homework is multifaceted, and its effects increase with planning and
preparation (Corno & Xu, 2004). The
challenge for teachers is to assign homework that strengthens the
targeted skills and knowledge, but in a way that is relevant to
students (Coutts, 2004).
To achieve this task effectively, a teacher must know students’
thinking (An et al., 2004; Empson & Junk, 2004; Philipp, Clement,
Thanheiser, Schappelle & Sowder, 2003; Schifter, 2001). The
findings from this study confirmed the importance of building
knowledge of students’ thinking and provided a means for teachers
to learn and acquire the knowledge of students’ thinking from
grading daily homework and analyzing errors in students’ work. In
addition, this study examined how well teachers understand students’
thinking by grading homework and analyzing errors. The results of
this study show that grading homework helps teachers acquire
knowledge of students' thinking and that analyzing errors in
homework develops in teachers a deeper understanding of students’
thinking; in turn, the knowledge of students’ thinking will contribute
to a transition in teachers’ beliefs, profound PCK, and more effective
teaching, which leads to increased success in improving students’
learning. To improve US mathematics teaching significantly, this
study suggests that classroom teachers integrate this new model into their
teaching practice and that researchers pay more attention to the
effects of grading homework and analyzing errors.

ACKNOWLEDGMENTS

The research reported in this study was supported by the American
Educational Research Association Grants Program under an AERA-OERI
Grant. The opinions expressed in this study are those of the authors and
do not necessarily reflect the views of the American Educational Research
Association.

APPENDIX
Four Teachers’ PCK Problems
Problem 1: Teachers’ PCK on Fraction Addition and Subtraction.

Pre-Questionnaire: Adam is a 10-year-old student in 5th grade who has average ability. His grade on the last test was an 82 percent. Look at Adam's written work for these problems: 3/4 + 4/5 = 7/9.
a. What prerequisite knowledge might Adam not understand or be forgetting?
b. What questions or tasks would you ask of Adam in order to determine what he understands about the meaning of fraction addition?
c. What real world example of fractions is Adam likely to be familiar with that you could use to help him?

Post-Questionnaire: George is a 10-year-old student in 5th grade who has average ability. His grade on the last test was an 82 percent. Look at George's solution for this problem: 3/5 - 1/2 = 2/3.
a. What prerequisite knowledge might George not understand or be forgetting?
b. What questions or tasks would you ask George in order to determine what he understands about the meaning of fraction subtraction?
c. What real world example of fractions is George likely to be familiar with that you could use to help him?
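For reference, the correct computations behind these two items are

\[
\frac{3}{4} + \frac{4}{5} = \frac{15}{20} + \frac{16}{20} = \frac{31}{20},
\qquad
\frac{3}{5} - \frac{1}{2} = \frac{6}{10} - \frac{5}{10} = \frac{1}{10};
\]

Adam's 7/9 and George's 2/3 both arise from adding or subtracting the numerators and the denominators separately.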

Problem 2: Teachers’ PCK on Ordering Fractions.

Pre-Questionnaire: A fifth-grade teacher asked her students to write the following three numbers in order, from smallest to largest: 3/8, 1/4, 2/3. Latoya, Robert, and Sandra placed them in order as follows.
Latoya: 1/4, 2/3, 3/8
Robert: 2/3, 1/4, 3/8
Sandra: 1/4, 3/8, 2/3
a. What might each of the students be thinking?
b. What question would you ask Latoya to find out if your opinion of her thinking is correct?
c. How would you correct Robert's misconception about comparing the size of fractions?

Post-Questionnaire: A fifth-grade teacher asked her students to write the following three numbers in order from largest to smallest: 5/11, 4/5, 2/3. Linda, Juan, and Donna placed them in order as follows.
Linda: 5/11, 4/5, 2/3
Juan: 2/3, 4/5, 5/11
Donna: 4/5, 2/3, 5/11
a. What might each of the students be thinking?
b. What question would you ask Linda to find out if your opinion of her thinking is correct?
c. How would you correct Juan's misconception about comparing the size of fractions?

Problem 3: Teachers’ PCK on Fraction Multiplication and Division.

Pre-Questionnaire: You are planning to teach procedures for doing the following types of fraction multiplication: 3/4 × 2/3 = 6/12.
a. Describe an introductory activity that would engage and motivate your students to learn this procedure.
b. Multiplication can be represented by repeated addition, by area, or by combinations. Which one of these representations would you use to illustrate fraction multiplication to your students? Why?
c. Describe an activity that would help your students understand the procedure of multiplying fractions.

Post-Questionnaire: You are planning to teach the following type of fraction division: 1/2 ÷ 1/3.
a. Describe the meaning of the above fraction division.
b. Describe an introductory activity that would engage and motivate your students to learn fraction division.
c. Fraction division can be solved by multiplication or by graphs (such as a circle graph). Which one of these methods would you use to illustrate fraction division to your students? Why?
d. Describe an activity that would help your students understand how to solve fraction division problems.

Problem 4: Teachers’ PCK on Proportions.


Pre-Questionnaire: Your students are trying to solve the following proportion problem: The ratio of girls to boys in Math club is 3:5. If there are 40 students in the Math club, how many are boys?
a. Describe an activity that you would use to determine the types of solution strategies your students have used to solve the problem.
Here are two students' solutions to the problem:
Juan's solution: 3/5 = x/40. There are 24 girls, so there are 16 boys.
Kathy's solution: 3/8 = x/40. There are 15 boys.
b. What question would you ask Kathy to determine if she could justify her answer and reasoning?
c. What suggestion would you provide to Juan that might help him revise his approach?

Post-Questionnaire: Your students are trying to solve the following problem using proportion: Sara earned some money from working during the summer. She spent 40% of the money on books. The books cost $80. How much money did she earn from working in the summer?
a. Describe an activity that you would use to determine the types of solution strategies your students have used to solve the problem.
Here are two students' solutions to the problem:
Larry's solution: 40/80 = x/100. x = $50. $50 + $80 = $130. Sara earned $130.
Steve's solution: 40/60 = 80/x. x = $120. Sara earned $120 in the summer.
b. What suggestion would you provide to Larry that might help him revise his approach?
c. What question would you ask Steve to determine if he could justify his answer and reasoning?
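For reference, the ratio 3:5 in the pre-questionnaire item means girls make up 3/8 and boys 5/8 of the club, so

\[
\text{boys} = \frac{5}{8} \times 40 = 25, \qquad \text{girls} = \frac{3}{8} \times 40 = 15.
\]

Kathy's proportion 3/8 = x/40 is set up for the girls and yields 15, while Juan's 3/5 = x/40 treats the part-to-part ratio 3:5 as a part-to-whole fraction. In the post-questionnaire item, the correct proportion 40/100 = 80/x gives $200 as Sara's summer earnings.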

REFERENCES

An, S. (2000). A comparative study of mathematics programs in the U.S. and China: The
pedagogical content knowledge of middle school mathematics in the U.S. and China.
Doctoral Dissertation, Texas A&M University, College Station, TX.
An, S. (2004). The middle path in math instruction: Solutions for improving math
education. Lanham, MD: Scarecrow Education.

An, S., Kulm, G. & Wu, Z. (2004). The pedagogical content knowledge of middle school
mathematics teachers in China and the U.S. Journal of Mathematics Teacher Education,
7, 145–172.
An, S. & Wu, Z. (2008). Approaches to assessing students’ thinking from analyzing errors
in homework. In C. E. Malloy (Ed.), Mathematics for every student: Responding to
diversity, grades 6–8. Reston, VA: NCTM.
Andrews, G. R. & Debus, R. L. (1978). Persistence and the causal perception of
failure: Modifying cognitive attributions. Journal of Educational Psychology, 70,
154–166.
Ashlock, R. B. (1994). Error patterns in computation. New York: Macmillan.
Austin, J. (1976). Do comments on mathematics homework affect student achievement?
School Science and Mathematics, 76, 159–164.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological
Review, 84, 191–215.
Borasi, R. (1994). Capitalizing on errors as “springboards for inquiry”: A teaching
experiment. Journal for Research in Mathematics Education, 25(2), 166–208.
Butler, J.A. (2001). Homework. Northwest Regional Educational Laboratory Web Site.
Retrieved June 14, 2006, from http://www.nwrel.org/scpd/sirs/1/cu1.html.
Cooney, T. J. & Shealy, B. E. (1997). On understanding the structure of teachers’ beliefs
and their relationship to change. In E. Fennema & B. S. Nelson (Eds.), Mathematics
teachers in transition (pp. 87–109). Mahwah, NJ: Lawrence Erlbaum.
Cooper, H. M. (1989a). Homework. New York: Longman.
Cooper, H. M. (1989b). Synthesis of research on homework. Educational Leadership, 47
(3), 85–91.
Cooper, H. M. & Valentine, J. C. (2001). Using research to answer practical questions
about homework. Educational Psychologist, 36, 143–153.
Corno, L. (2000). Looking at homework differently. The Elementary School Journal, 100(5),
529–548 (Special Issue: Non-Subject-Matter Outcomes of Schooling [II]).
Corno, L. & Xu, J. (2004). Homework as the job of childhood. Theory Into Practice, 43
(3), 227–233.
Coulter, F. (1979). Homework: A neglected research area. British Educational Research
Journal, 5(1), 21–33.
Coutts, P. (2004). Meanings of homework and implications for practice. Theory Into
Practice, 43(3), 182–188.
Empson, S. B. & Junk, D. L. (2004). Teachers’ knowledge of children’s mathematics after
implementing a student centered curriculum. Journal of Mathematics Teacher
Education, 7, 121–144.
Falkner, K. P., Levi, L. & Carpenter, T. P. (1999). Children’s understanding of equality: A
foundation for algebra. Teaching Children Mathematics, 6, 232–236.
Fennema, E., Carpenter, T. P. & Lamon, S. J. (1991). Integrating research on teaching
and learning mathematics. Albany, NY: State University of New York Press.
Fennema, E. & Franke, M. L. (1992). Teachers' knowledge and its impact. In D. A.
Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 147–164). New
York: Macmillan.
Gagne, E. D., Crutcher, R. L., Joella, A., Geisman, C., Hoffman, V. D., Schutz, P. &
Lizcano, L. (1987). The role of student processing of feedback in classroom
achievement. Cognition and Instruction, 4(3), 167–186.

Graeber, A. O. (1999). Forms of knowing mathematics: What preservice teachers should
learn. In D. Tirosh (Ed.), Forms of mathematical knowledge. Boston: Kluwer Academic.
Guba, E. G. & Lincoln, Y. S. (1994). Competing paradigms in qualitative research. In N. K.
Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research. Thousand Oaks, CA: Sage.
Hill, H. C., Rowan, B. & Ball, D. L. (2005). Effects of teachers’ mathematical knowledge for
teaching on student achievement. American Educational Research Journal, 42(2), 371–406.
Lincoln, Y. S. & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.
National Council of Teachers of Mathematics (NCTM). (2000). Principles and standards
for school mathematics. Reston, VA: NCTM.
National Research Council (2001). Adding it up: Helping children learn mathematics. J.
Kilpatrick, J. Swafford & B. Findell (Eds.), Mathematics Learning Study Committee,
Center for Education, Division of Behavioral and Social Sciences and Education.
Washington, DC: National Academy Press.
Pajares, F. (2002). Gender and perceived self-efficacy in self-regulated learning. Theory
Into Practice, 41(2), 116–125.
Paschal, R. A., Weinstein, T. & Walberg, H. J. (1984). The effects of homework on
learning: A quantitative synthesis. The Journal of Educational Research, 78(2), 97–104.
Philipp, R. A., Clement, L.L., Thanheiser, E., Schappelle, B. & Sowder, J.T. (2003). Integrating
mathematics and pedagogy: An investigation of the effects on elementary preservice
teachers’ beliefs and learning of mathematics. Paper presented at the research presession of
the annual meeting of the National Council of Teachers of Mathematics, San Antonio, TX.
Rumelhart, D. E. (1980). Schemata: The building blocks of cognition. In R. J. Spiro, B. C.
Bruce & W. F. Brewer (Eds.), Theoretical issues in reading comprehension (pp. 35–
58). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Schifter, D. (2001). Learning mathematics for teaching: From a teachers’ seminar to the
classroom. Journal of Mathematics Teacher Education, 1(1), 55–87.
Schoen, H. L. & Kreye, B. C. (1974). Five forms of written feedback to homework in a
mathematics course for elementary teachers. Journal for Research in Mathematics
Education, 5(3), 140–145.
Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4–14.
Sowder, J., Philipp, R., Armstrong, B. & Schappelle, B. (1998). Middle-grade teachers’
mathematical knowledge and its relationship to instruction. Albany, NY: SUNY.
Stiggins, R. J., Frisbie, D. A. & Griswold, P. A. (1989). Inside high school grading practices:
Building a research agenda. Educational Measurement: Issues and Practice, 8(2), 5–14.
Wu, Z. (2004). The study of middle school teachers' understanding and use of
mathematical representation in relationship to teachers' Zone of Proximal Development
in teaching fractions and algebraic functions. Doctoral dissertation, Texas A&M
University, College Station, TX.

Shuhua An
California State University, Long Beach
1250 Bellflower Boulevard, Long Beach, CA 90840-2201, USA
E-mail: [email protected]

Zhonghe Wu
National University
3390 Harbor Blvd, Costa Mesa, CA, 92626-1502, USA
E-mail: [email protected]
