Professional learning opportunities for teachers of mathematics and science have increasingly focused on teachers’ content knowledge. Strategies aimed at deepening teachers’ disciplinary and pedagogical content knowledge often include attention to student thinking, such as analyzing student work and/or dialogue and learning about common student ideas and misconceptions. Experienced practitioners offer guidance for efforts to engage teachers in considering student thinking as a strategy for deepening their disciplinary and pedagogical content knowledge. Insights provided by a group of expert practitioners with diverse backgrounds and experiences in working with teachers included the following ideas:
- Not any student work will do—Selection of appropriate work samples makes a difference in giving teachers an opportunity to deepen their disciplinary and pedagogical content knowledge.
- Learn, then apply—Teachers should first analyze a carefully crafted set of student work before analyzing their own students’ work.
- Focus on the positive—Analysis of student work should include a focus on what students understand, not just what they do not understand.
- Which comes first?—Practitioners have different perspectives on the level of teachers’ content knowledge needed prior to the analysis of student understanding.
- Bring content in through the back door—Analysis of assessment items provides opportunities to deepen teachers’ understanding of mathematics/science content.
Practitioner Insights
Professional development programs often use strategies that focus teachers on student thinking. Among the strategies recommended by program leaders were analysis of students’ written work, analysis of student talk, and consideration of research on student misconceptions. Such strategies can help develop teachers’ pedagogical content knowledge and can also provide a springboard for deepening teachers’ understanding of mathematics/science ideas, offering a safe opportunity for teachers of varying content understanding to access the content. As one program leader noted, focusing on student thinking allows “a chance to see that there is much to know about the content for all of us and everyone has something to offer. It reduces some of the anxiety from the ‘doing content’ aspect of professional development.”
Analysis of student artifacts provides multiple entry points for teachers to discuss student thinking about content. In addition, program leaders noted that teachers see this type of professional development as highly relevant to classroom practice. As one MSP Co-PI shared,
Looking at student work, listening to student interviews, and observing students during instruction are powerful vehicles for gaining insight into how students think. It is remarkable how little time teachers spend examining student work, carefully observing students, or monitoring student dialogue. All too often teachers focus more on what they themselves are doing rather than what students are doing. Collecting data in this way can often provide the evidence needed to demonstrate that students really aren’t mastering ideas we assumed they knew. It can also provide a way into viewing those commonly held naive ideas or misconceptions that we thought we had corrected.
When queried about using strategies that focus on student thinking for deepening teachers’ disciplinary and pedagogical content knowledge, experienced practitioners offered a number of insights, which are described below. After reviewing these insights, you will be provided with opportunities to share your own experiences with using these strategies for these purposes. The information you provide will be analyzed along with the insights and examples from other practitioners as this website is periodically updated.
Not any student work will do—Selection of appropriate work samples makes a difference in giving teachers an opportunity to deepen their disciplinary and pedagogical content knowledge.
Analysis of student work can help deepen teachers’ understanding of how students think about mathematics/science ideas. Experienced program leaders note that in designing these types of experiences for teachers, the selection of appropriate work samples is key. When your purpose is to illustrate how a group of students might think about a concept, it is essential that the individual examples of student work be detailed enough to reveal each student’s reasoning. In addition, the set of examples needs to demonstrate an appropriate range of how students typically think about a problem or concept.
Although most advice from program leaders focused on experiences that demonstrated the range of student thinking, one program leader emphasized that showing this range is not the only purpose for using student work samples. This individual cited a number of other possible purposes:
Sets of student work samples can comprise a single student’s work over time to show increasing depth of understanding. Or sets of work can be assembled to show a variety of pedagogical content knowledge representations of an idea and how students responded. Or sets of work can be assembled to show the persistence of a common misconception across students or with a single student over time.
Another program leader noted that analysis of student work, while beneficial, can be a very time-consuming task. Given the limited amount of time likely to be available for professional development, and given that it might not be feasible to engage teachers with a wide range of student thinking about a variety of topics, this individual suggested that professional development should be explicit about the types of understanding that are most important for teachers to gain: “Help teachers understand what the most common student responses will be (and the most problematic responses, and the desired responses) and why, and how to deal with each of these.”
Learn, then apply—Teachers should first analyze a carefully crafted set of student work before analyzing their own students’ work.
Experienced program leaders recommended that teachers’ initial experiences with analyzing student work use a prepared set of examples rather than work that teachers bring in to the professional development. First, student work external to their classrooms is “safe” for teachers, who may otherwise feel the need to defend their students’ work, and by extension their own instruction. As one program leader stated, “I found that starting with work of students not in the professional development team’s classes [was] helpful. The ‘teacher-neutral’ student work provided a context for learning how to analyze student work without need to defend its quality and for developing a trusting collegial team. These activities set the stage for the analysis by the team of their own students’ work. Distinguishing critique from analysis is an essential prerequisite to team analysis of their own students’ work.”
Second, since the range of responses in a single class may be quite limited, the use of carefully selected samples can ensure that teachers engage with a range of student work, aiding in the development of the skills necessary for analysis. Third, pre-selection of student work samples helps professional development facilitators be prepared in advance to support teachers in analyzing student work, focusing on evidence of what students do and do not know. Some program leaders suggested that using samples of student work provided by, or closely related to, the instructional materials that teachers use in their classes was particularly helpful in increasing the interest and motivation of the participating teachers.
Whatever the source of student work samples, it is important to provide a conceptual framework for teachers to use in examining the work, one that makes explicit the targeted mathematics/science ideas and how these ideas are connected.
Once teachers become comfortable with the process of analyzing prepared student work samples, program leaders suggest they should move on to analyzing their own students’ work. Not only is their own students’ work likely to be intrinsically more interesting to teachers, but examining that work also allows teachers to use the knowledge gained about student understanding of a concept to reflect on their own instructional practice.
A number of program leaders suggested that excerpts of lesson videos could also be useful fodder for sessions meant to deepen teachers’ understanding of student thinking. One MSP Co-PI shared:
We have had remarkably good success showing videos of students working problems. I think that teachers relate to this because they can compare the students’ thinking with students from their own class. And, of course, the students are often engaging – somewhat like the class we wish we had.
Similar to the use of student work samples, program leaders recommended that video excerpts be “rich” enough to provide evidence of how students are thinking about important mathematics/science concepts (e.g., students talking about how they approached a problem or activity, not simply giving “the answer”). Videos of students are inherently interesting to teachers and provide a mechanism for engagement in the professional development. Program leaders caution, however, that it is important to keep the focus on understanding student thinking, resisting any temptation participants might have to critique teacher decision-making unless that is the purpose of the activity.
Focus on the positive—Analysis of student work should include a focus on what students understand, not just what they do not understand.
Regardless of whether teachers in the professional development are novices or experienced in analyzing student work samples, program leaders suggested that both the session design and its implementation need to keep an explicit focus on what students understand, countering the tendency teachers have to focus only on what students “got wrong.” In addition, sessions need to emphasize the evidence from the student work that provides information on what a student understands. Said one MSP program leader:
Evidence is key – teachers have to learn to use evidence from the student work, not make assumptions. Learning to look at student work required learning how to observe the data (student work), make inferences about their understanding, and considering what additional evidence you may need to support your inference. This is critical to letting go of “I taught it therefore they learned it.”
Which comes first?—Practitioners have different perspectives on the level of teachers’ content knowledge needed prior to the analysis of student understanding.
While program leaders expressed similar views on the nature of tasks to engage teachers with student thinking in professional development settings, there seemed to be different views on the depth of content knowledge that teachers needed in order to analyze student thinking. Some program leaders indicated that teachers need to have an understanding of the content in order to make sense out of student thinking and that it is impossible for teachers to focus on student understanding unless they first have a clear understanding of the content themselves. Said one program leader,
Unless a teacher has an understanding of the concept, he/she will find it difficult to truly evaluate the validity of a student’s thinking. A teacher who is just familiar with the content (does not have a rich understanding) might be able to follow some of the students’ logic, but might have difficulty evaluating obscure methods that some students might have used. He/she would also find it more difficult to analyze any misconceptions that might be found in the work.
Others argued that these purposes can be accomplished simultaneously: through a facilitated examination of student work, teachers can both develop their own content knowledge and engage with student thinking about content. For one program leader, addressing teacher and student understanding of the concepts at the same time provides a particularly valuable way to address teacher content deficiencies. In this program leader’s words,
I think teachers learn the content as they engage with student thinking. They do not have to know it all first. This is a very productive way to engage in asking yourself questions about what it means to understand and to ask it in a way that is less threatening as it is about the students.
Bring content in through the back door—Analysis of assessment items provides opportunities to deepen teachers’ understanding of mathematics/science content.
Engaging teachers in developing and analyzing student assessment items can provide opportunities to deepen teachers’ understanding of mathematics/science content, as they consider which responses would be correct and why. While program leaders indicated that there was value in this practice, they also cautioned that the actual development of assessments is very difficult and the potential benefits to teachers’ content knowledge can easily be sidetracked by concerns over item design. It was noted that it might be more productive to have teachers analyze and react to items rather than develop the items themselves.
If you are interested in how these practitioner insights were collected and analyzed, a summary of the methodology can be found here.
Teacher Content Knowledge Matters
Empirical evidence demonstrates that teachers’ mathematics/science content knowledge makes a difference in their instructional practice and their students’ achievement. Consistent findings across studies include:
- Teachers’ mathematics/science content knowledge influences their professional practice.
- Teachers’ mathematics/science content knowledge is related to their students’ learning.
Learn more about research on why teachers’ mathematics/science content knowledge matters.
Research on Engaging Teachers in Considering Student Thinking about Mathematics and Science
Research studies of 12 interventions that engaged teachers in considering student thinking about mathematics content, as one of several strategies, were identified in a search of the published literature. All 12 provided evidence of positive effects on teachers’ disciplinary and/or pedagogical content knowledge in mathematics (Basista & Mathews, 2002; Clark & Schorr, 2000; Dole, Clark, Wright, Hilton, & Roche, 2008; Ellington, Whitenack, Inge, Murray, & Schneider, 2009; Empson, 1999; Featherstone, Smith, Beasley, Corbin, & Shank, 1995; Franke, Carpenter, Fennema, Ansell, & Behrend, 1998; Goldsmith & Seago, 2007; Miller, 1991; Sowder, Philipp, Armstrong, & Schappelle, 1998; Stecher & Mitchell, 1995; Swafford, Jones, & Thornton, 1997; Swafford, Jones, Thornton, Stump, & Miller, 1999). Teacher participants in the interventions taught grades ranging from Kindergarten through grade 12. Across the studies, topics in number and operations, algebra, geometry, measurement, and data/probability/statistics were addressed, as well as the mathematical processes of communication, representation, and problem solving. Although no studies investigated the unique contribution of engaging teachers with student thinking in mathematics as a professional development strategy, consistent positive results across the programs support claims regarding its effectiveness in deepening teachers’ disciplinary and pedagogical content knowledge in mathematics.
Research on Engaging Teachers in Considering Student Thinking about Mathematics
Professional learning opportunities for teachers of mathematics often include attention to student thinking as a part of deepening teachers’ mathematics disciplinary and/or pedagogical content knowledge. Several different approaches to engaging teachers in considering student thinking were found in research studies investigating the effects of interventions on teachers’ mathematics content knowledge. Twelve research studies investigated professional development programs that included this strategy.
What Research Says
Each of the 12 studies of professional learning experiences that included engaging teachers in considering student thinking about mathematics reported positive effects on participating teachers’ content knowledge. Although none of these studies investigated the unique contribution of the strategy of engaging teachers in attending to student thinking, consistent positive results across programs support claims regarding its effectiveness in deepening teachers’ mathematics content knowledge.
The 12 studies were spread across grades Kindergarten through 12, with more studies in the elementary and middle grades than in the high school grades. Across the studies, topics in number and operations, algebra, geometry, measurement, and data/probability/statistics were addressed, as well as the mathematical processes of communication, representation, and problem solving. The experiences for teachers in these 12 studies were quite varied. Eight of the studies examined the impacts of formal professional development experiences on teachers’ mathematics content knowledge. Two were structured as four-week summer institutes (Basista & Mathews, 2002; Swafford, Jones, & Thornton, 1997; Swafford, Jones, Thornton, Stump, & Miller, 1999), the latter of which also provided six half-day follow-up sessions during the ensuing academic year. One was a ten-day workshop (Ellington, Whitenack, Inge, Murray, & Schneider, 2009), one was structured as a semester-long course (Clark & Schorr, 2000), and one took the form of 12 three-hour sessions (Goldsmith & Seago, 2007). Another study involved teachers in monthly meetings with university faculty over several academic years, as well as visits by university faculty to the teachers’ classrooms (Sowder, Philipp, Armstrong, & Schappelle, 1998). One study included professional learning groups throughout the school year, during professional development days and afternoon workshops (Dole, Clark, Wright, Hilton, & Roche, 2008). Three of the studies investigated teaching practice as a context for teachers to deepen their content knowledge, although it is important to note that the teacher participants in each of these three studies had also engaged in related professional development (Empson, 1999; Featherstone, Smith, Beasley, Corbin, & Shank, 1995; Miller, 1991).
One other study researched impacts on teachers’ content knowledge related to both a professional development experience and subsequent teaching practice (Franke, Carpenter, Fennema, Ansell, & Behrend, 1998).
The means of engaging teachers in considering student thinking also varied across the interventions that were studied. Ten of the programs included activities to engage teachers directly in analysis of student thinking. Five of these ten engaged teachers with written student work, including one in which teachers learned about scoring prepared samples of written student work using rubrics (Stecher & Mitchell, 1995) and four in which teachers were provided with prepared samples of student work (Dole et al., 2008; Ellington et al., 2009; Franke et al., 1998; Goldsmith & Seago, 2007). One of these studies also included work from participating teachers’ own students (Franke et al., 1998). Two additional initiatives had teachers interview students about a mathematical topic or their thinking about problems (Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999). Another program had its teacher participants implement problems in their own classrooms and subsequently reflect on their students’ mathematical thinking about those problems (Clark & Schorr, 2000). Similarly, one study of teacher learning from practice involved teachers in reflecting on what their students wrote in response to specific mathematics writing prompts (Miller, 1991). The other program that engaged teachers with analyzing student thinking was also included in a study of teacher learning from practice, but its strategies varied somewhat from teacher to teacher, including watching videos of instruction and examining written work from other teachers’ classrooms, and implementing problems in teachers’ own classrooms and reflecting on how the students thought about them (Featherstone et al., 1995).
Strategies for engaging teachers in considering student thinking other than direct analysis of student work were included in four interventions that were studied. Two of the interventions provided experiences in which teachers learned about research studies and their findings on student thinking in mathematics (Franke et al., 1998; Sowder et al., 1998). One of the programs had a focus on pedagogical issues that included strategies for assessing students’ prior understandings in mathematics (Basista & Mathews, 2002). Finally, one study examined teacher learning from practice as teachers implemented new mathematics curriculum materials that provided activities to focus their instructional practice on their students’ thinking (Empson, 1999).
In addition to the use of at least one strategy for engaging teachers in considering student thinking, some other potentially important commonalities are evident among the experiences studied within subsets of the 12 studies. First, across all 12 studies, the strategies for engaging teachers in considering student thinking were designed specifically to help teachers connect what they were learning about mathematics content and student thinking to their classroom teaching. Second, 10 of the 12 programs (Basista & Mathews, 2002; Clark & Schorr, 2000; Dole et al., 2008; Ellington et al., 2009; Featherstone et al., 1995; Franke et al., 1998; Goldsmith & Seago, 2007; Miller, 1991; Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999) engaged teachers in a fairly lengthy and intensive program focused on mathematics content and mathematics teaching. Third, the same 10 experiences included facilitation or partnership roles involving university faculty from mathematics or mathematics education departments. None of these features was studied systematically for its contribution to the outcomes, but their common occurrence in these experiences suggests some potential importance with respect to the goal of deepening teachers’ content knowledge through attending to student thinking.
All 12 of the experiences investigated in these studies included goals for developing aspects of teachers’ knowledge of mathematics-specific pedagogy. Eight of the experiences also aimed at deepening teachers’ disciplinary mathematics content knowledge (Basista & Mathews, 2002; Clark & Schorr, 2000; Dole et al., 2008; Featherstone et al., 1995; Franke et al., 1998; Miller, 1991; Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999). However, measures for these targeted areas of teacher knowledge were not used in all of the studies. Specifically, only four of the studies used measures of disciplinary content knowledge (Basista & Mathews, 2002; Clark & Schorr, 2000; Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999).
With the exception of one study (Stecher & Mitchell, 1995), teachers participated in each of these experiences on a voluntary basis, so generalizability of the findings from these studies must be considered in this light. The populations that the participating teachers represent are limited to those willing and able to commit to the interventions, which, as noted above, were typically extensive in duration.
Seven of the studies used either a pre-post design to measure changes in teachers’ content knowledge or traced changes over time (Basista & Mathews, 2002; Clark & Schorr, 2000; Dole et al., 2008; Ellington et al., 2009; Goldsmith & Seago, 2007; Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999), while the other five presented only post-experience data. Given the experience levels of many of the participating teachers, the extent of professional development provided, and the nature of the measured changes, it is certainly reasonable to argue that changes resulted from the interventions, but without comparisons to other teachers, or over time, these claims are not solidly grounded in empirical evidence.
Only one of the studies used comparison groups of teachers who did not participate in the professional development programs or teaching experiences that were being investigated (Goldsmith & Seago, 2007). Consequently, it is possible that participating teachers might perform better on a measure of content knowledge on a post-test simply because they had completed it previously, in one case (Basista & Mathews, 2002) only a few weeks before. The use of multiple measures addresses this concern to some extent. In the Swafford and colleagues (1997, 1999) study, participating teachers performed better on written assessments in three different content areas, and on three separate measures of knowledge of geometry, following treatment. A number of studies used measures drawing on different data collection strategies to triangulate findings. Two studies used written assessments, interviews, and observations as measures of content-related knowledge (Clark & Schorr, 2000; Sowder et al., 1998). Others used two of these three strategies: written assessments and observations (Basista & Mathews, 2002); interviews and observations (Featherstone et al., 1995; Franke et al., 1998; Miller, 1991); or written assessments and interviews (Sowder et al., 1998). Only two of the 12 research efforts (Goldsmith & Seago, 2007; Swafford et al., 1997; Swafford et al., 1999) used any externally developed measures of teacher content knowledge.
Additional limitations were noted regarding some of these studies. In three studies, the intervention being investigated was described in very little detail, limiting the strength of interpretations linking teachers’ experiences with the results (Clark & Schorr, 2000; Dole et al., 2008; Empson, 1999). Description of analysis procedures in three studies was not adequate, either providing insufficient evidence to link the results to the teachers’ experiences (Clark & Schorr, 2000), or not describing how qualitative examples were selected from the full range of available data (Empson, 1999; Miller, 1991). In one study (Ellington et al., 2009), substantial differences in the administration of the pre- and post-tests raise questions about validity of the results.
For the research on engaging teachers in considering student thinking about mathematics bibliography, click here. [PDF 10K]
The 12 studies described above were part of a more inclusive review of research on experiences intended to deepen teachers’ mathematics content knowledge. For a summary of that review, click here. [PDF 120K]
The literature search surfaced nine research studies of professional development programs that included engaging teachers in considering student thinking about science content. Each intervention included several strategies, and none was designed to measure the unique influence of attention to student thinking. Still, each one reported evidence that teachers’ disciplinary and/or pedagogical content knowledge in science increased (Basista & Mathews, 2002; Dole et al., 2008; Drechsler & van Driel, 2008; Heller, Daehler, & Shinohara, 2003; Lee, Lewis, Adamson, Maerten-Rivera, & Secada, 2008; Rosebery & Puttick, 1998; Schibeci & Hickey, 2000; Shymansky et al., 1993; van Driel, Verloop, & de Vos, 1998). Although teacher participants in the studies taught grades ranging from Kindergarten through 12, most of the research was focused on elementary and middle grades. Further, although earth, life, and physical science were represented, physical science was by far the most frequently studied content area.
Research on Engaging Teachers in Considering Student Thinking about Science
Professional learning opportunities for teachers of science have increasingly focused teachers on student thinking about science ideas as a way of deepening teachers’ disciplinary and/or pedagogical content knowledge. Nine research studies investigated professional development programs that included this strategy.
What Research Says
The nine studies were concentrated in the elementary and middle grades, although teacher participants taught grades ranging from Kindergarten through 12. Across the studies, topics in earth, life, and physical science were addressed, with seven of the nine studies focused specifically on physical science. The experiences for teachers in these studies were quite varied, ranging from an inquiry into various science topics over four years (Rosebery & Puttick, 1998) to a two-day workshop on “natural” vs. “processed” materials (Schibeci & Hickey, 2000). Between these extremes were a four-week summer institute (Basista & Mathews, 2002); a professional learning group held during five professional development days and four afternoon workshops (Dole, Clark, Wright, Hilton, & Roche, 2008); monthly teacher meetings over the course of a school year (Heller, Daehler, & Shinohara, 2003); five full-day workshops throughout the academic year (Lee, Lewis, Adamson, Maerten-Rivera, & Secada, 2008); a teacher training course (Drechsler & van Driel, 2008); a one-week summer workshop, preceded by an orientation session and followed by academic year activities (Shymansky et al., 1993); and an academic year workshop with sessions prior to, during, and following a unit of instruction on chemical equilibrium (van Driel, Verloop, & de Vos, 1998).
The interventions were just as varied in how they incorporated attention to student thinking. Two studies (Shymansky et al., 1993; van Driel et al., 1998) shared a similar structure of having teachers reflect on student ideas, teach a unit on the relevant content, and reflect again on student thinking. Shymansky and colleagues (1993) had teachers interview students prior to teaching, whereas van Driel and colleagues (1998) presented teachers with written student responses to assessment tasks. Two other studies also used prepared student responses: Dole and colleagues (2008) had teachers comment on prepared student responses on a survey, and Heller and colleagues (2003) engaged teachers with authentic classroom cases linked to hands-on investigations. The teacher described by Rosebery and Puttick (1998) spent several years teaching science to children and reflecting with a small group of teachers and researchers on the students’ learning. In each of the studies described above, student thinking was a focal point of the intervention. The remaining studies (Basista & Mathews, 2002; Lee et al., 2008; Schibeci & Hickey, 2000) included student thinking as a secondary emphasis.
Teachers participated in each of these experiences on a voluntary basis, so generalizability of the findings from the studies must be considered in this light. In addition, for seven of the nine studies, the populations that the participating teachers represent are limited to those willing and able to commit to extensive interventions.
All but two of the nine studies (Drechsler & van Driel, 2008; van Driel et al., 1998) looked at impacts on teachers’ disciplinary knowledge. Positive impacts were reported in each instance, although two studies (Schibeci & Hickey, 2000; Shymansky et al., 1993) reported mixed results. Only Drechsler and van Driel (2008) and van Driel and colleagues (1998) studied changes in pedagogical content knowledge, both reporting positive impacts.
Although all of these studies used either a pre-post design to measure changes in teachers’ content knowledge or traced changes over multiple points in time, only one (Lee et al., 2008) used comparison groups of teachers who did not participate in the professional development programs. It is possible that participating teachers might perform better on a measure of content knowledge or pedagogical content knowledge on a post-test simply because they had completed it previously, in some cases a relatively short time before (Basista & Mathews, 2002; Schibeci & Hickey, 2000). Only one of the studies used an externally developed measure of teacher content knowledge (Schibeci & Hickey, 2000). In most of the studies, little information was provided on how the measures were developed and validated for the purpose of assessing growth in teachers’ disciplinary or pedagogical content knowledge.
Additional limitations were noted regarding some of these studies. In one study (Schibeci & Hickey, 2000), no statistical tests were performed to substantiate claims of impacts on content knowledge. In two other studies (Heller et al., 2003; van Driel et al., 1998), the analysis of the data was not described in enough detail to evaluate the evidence to support the researchers’ claims.
For the research on engaging teachers in considering student thinking about science bibliography, click here. [PDF 8K]
The nine studies described above were part of a more inclusive review of research on experiences intended to deepen teachers’ science content knowledge. For a summary of that review, click here. [PDF 134K]