Teacher Knowledge: Analyzing Classroom Instruction

Professional learning opportunities for teachers of mathematics and science have increasingly focused on teachers’ content knowledge, including their pedagogical content knowledge. Learning opportunities aimed at deepening pedagogical content knowledge often employ the strategy of examining mathematics/science classroom practice. Experienced practitioners offer guidance about engaging teachers in analyzing classroom instruction for this purpose. Insights provided by a group of expert practitioners with diverse backgrounds and experiences in working with teachers included the following ideas:

  • Content knowledge is a prerequisite—Teachers need to have an understanding of the mathematics/science content prior to analyzing how classroom practice is intended to develop students’ understanding.
  • Create a safe space for learning—Teachers should analyze classroom practice of unknown teachers before analyzing their own practice or that of their colleagues.
  • Lean on me—Teachers need time and support to develop the analytic skills to examine instruction critically and productively.
  • Frame the learning—The examination of instruction should include scaffolding to focus teachers on how student learning of the targeted mathematics/science concepts is or is not supported.
  • Consider critical junctures—Analysis of classroom practice should focus on instructional decision points.

Practitioner Insights

Teaching involves making minute-by-minute decisions about instruction – what questions to ask, and of which students; when to let students grapple with a problem on their own, and when to step in with hints or explanations; when to move on to the next activity; and so on. Providing mathematics/science teachers with opportunities to analyze classroom practice – their own or someone else’s – is one strategy employed in professional development to give teachers practice in considering the affordances of different instructional moves so they will be better prepared for their future instructional decisions.

Experienced program leaders offered some insights about engaging teachers in the analysis of classroom practice, which are described below. After reviewing these insights, you will have opportunities to share your own experiences with using this strategy in professional development. The information you provide will be analyzed along with the insights and examples from other practitioners as the website is periodically updated.

Content knowledge is a prerequisite—Teachers need to have an understanding of the mathematics/science content prior to analyzing how classroom practice is intended to develop students’ understanding.

Professional development programs may include having teachers analyze videos or transcripts of classroom instruction, either those they prepare themselves or examples provided in professional development materials. Experienced program leaders noted several prerequisites for teachers to make progress in analyzing classroom instruction. Teachers should themselves have a good understanding of the targeted mathematics/science concepts, and they should understand how the student tasks used in the instances of practice they will analyze are intended to develop those concepts. In addition, teachers should be aware of how students typically think about the ideas. Toward that end, program leaders suggested that teachers should first work on the tasks themselves and then examine artifacts of instruction before thinking about potential instructional “moves.” This approach allows teachers to focus on how the instruction they are analyzing is and is not developing student understanding.

Create a safe space for learning—Teachers should analyze classroom practice of unknown teachers before analyzing their own practice or that of their colleagues.

In teachers’ early experiences analyzing instruction, program leaders recommended using instructional episodes—whether videos, transcripts, or cases—from teachers who are unknown to those participating in the professional development. Analyzing instruction implemented by someone external to the group avoids defensiveness about what a teacher did and did not do in a lesson, and allows the facilitator time to create both a focus on mathematics/science content and a positive culture of professional reflection. In addition, the program designer can select rich teaching episodes that focus on important mathematics/science content and provide evidence of how students are thinking about the content and how they are reacting to instructional moves. Using these episodes, teachers can develop sensibilities and skills that can be applied to future analyses. As teachers gain these skills and become comfortable with the process, the sessions can shift to examining the teachers’ own mathematics/science instruction. At this point, program leaders suggest that professional development progress carefully, having teachers first describe their mathematics/science instruction and bring in artifacts such as student work samples before introducing video of their own teaching.

Lean on me—Teachers need time and support to develop the analytic skills to examine instruction critically and productively.

Experienced program leaders emphasized that teachers need to have continuing support and practice in building the mindset and skills necessary for instructional analysis. One program leader described how teachers’ talk developed with this type of support in a professional development program:

Teachers asked each other questions about how they got their students to talk about or share their thinking. As the year progressed, teachers changed what they brought in terms of the detail of student work and the types of conversations they had about teaching. We had to work hard through our facilitation to get teachers to detail the strategies [students used] enough to differentiate them (beyond “they used counters” – to how they used counters and so on). As time went on, teachers did the detailing more automatically.

Frame the learning—The examination of instruction should include scaffolding to focus teachers on how student learning of the targeted mathematics/science concepts is or is not supported.

Program leaders recommend that the experience of analyzing teaching episodes (either video or lesson vignettes/descriptions) be structured to focus participants on aspects of the lesson that did or did not promote student learning. The sessions, according to these experts, should include an analytic frame for considering how the teacher’s instructional moves supported students’ learning of mathematics/science and why tasks did or did not play out as intended. For example, the Task Analysis Guide is a frame that has been used in mathematics professional development to structure analysis of how teachers do or do not maintain the high cognitive demand of the tasks they assign their students.†

Consider critical junctures—Analysis of classroom practice should focus on instructional decision points.

Experienced program leaders recommended focusing teachers on decision points in instruction. For example, they suggested that teachers be given practice in describing how students might respond to particular tasks and identifying which responses they would select to be presented, and in what order, to help develop student understanding of the mathematics/science concepts. They also recommended focusing on the questions that might be asked of students regarding particular solutions. A program leader described one such approach as follows:

After they solved the Orange Juice Task from Connected Mathematics Program, I had teachers analyze the 12 solutions produced by students. The solutions highlighted a range of approaches (unit rate, scale factor, percent, tables, pictures), levels of sophistication, and correctness. The work provided a rich context for talking about ways of comparing ratios as well as what constitutes a good response, how to orchestrate a discussion of varied responses so as to highlight the key mathematics, and for determining the questions that can be asked in order to assess and advance students’ thinking…I have asked teachers in their lesson planning of similar tasks to indicate how they expect students to respond to particular tasks and to identify the responses they would select in order to launch a discussion of the mathematics.
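To make concrete the kinds of comparisons such a discussion can surface, consider a hypothetical pair of juice recipes (illustrative values only, not the actual task): Mix A uses 2 cans of concentrate for 3 cans of water, while Mix B uses 3 cans of concentrate for 5 cans of water. A unit-rate approach compares concentrate per can of water:

\[
\text{Mix A: } \frac{2}{3} \approx 0.67 \qquad \text{Mix B: } \frac{3}{5} = 0.60
\]

Because 0.67 > 0.60, Mix A is the stronger-tasting recipe. A scale-factor approach reaches the same conclusion by rescaling both mixes to a common amount of water, say 15 cans: Mix A then calls for 10 cans of concentrate (2:3 scaled by 5), while Mix B calls for 9 (3:5 scaled by 3). Anticipating which of these strategies students will produce, and deciding in what order to present them, is the kind of decision-point planning the program leaders describe.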

Another experienced practitioner noted that it is important for professional development to address what teachers are most likely to see from students in response to certain tasks and prepare teachers to respond appropriately:

It’s certainly fun for teachers to look at “interesting” examples of student work, and this is exactly what we see in much professional development. (In some unfortunate cases of professional development, this is all we see.) But it’s more important, and more instructive for teachers, to see examples of what is typical (both right and wrong), as well as unusual approaches or interpretations, so teachers can be prepared to respond in ways that will move students forward toward richer understanding.

If you are interested in how these practitioner insights were collected and analyzed, a summary of the methodology can be found here.

† Stein, M. K., Smith, M. S., Henningsen, M. A., & Silver, E. A. (2000). Implementing standards-based mathematics instruction: A casebook for professional development. New York, NY: Teachers College Press.


Teacher Content Knowledge Matters

Empirical evidence demonstrates that teachers’ mathematics/science content knowledge makes a difference in their instructional practice and their students’ achievement. Consistent findings across studies include:

  • Teachers’ mathematics/science content knowledge influences their professional practice.
  • Teachers’ mathematics/science content knowledge is related to their students’ learning.

Learn more about research on why teachers’ mathematics/science content knowledge matters.

Research on Engaging Teachers in Analyzing Mathematics or Science Classroom Instruction

Studies of seven professional development programs that engaged teachers in analyzing mathematics classroom instruction, as one of several strategies, were identified in a search of the published literature (Basista & Mathews, 2002; Goldsmith & Seago, 2007; Lin, 2002; Santagata, 2009; Sowder, Philipp, Armstrong, & Schappelle, 1998; Swafford, Jones, & Thornton, 1997; Swafford, Jones, Thornton, Stump, & Miller, 1999; Vale & McAndrew, 2008). Findings for all but one of the seven interventions provided evidence of positive effects on teachers’ mathematics content knowledge. Teacher participants in the interventions ranged from grade 1 to grade 12. Across the studies, topics in number and operations, algebra, geometry, and data/probability/statistics were addressed. Although no studies investigated the unique contribution of engaging teachers in analyzing mathematics classroom instruction, generally positive results across the programs support claims regarding its effectiveness in deepening teachers’ mathematics disciplinary and/or pedagogical content knowledge.

Research on Engaging Teachers in Analyzing Mathematics Classroom Instruction

Professional learning opportunities for teachers of mathematics have increasingly focused on deepening teachers’ mathematics disciplinary and/or pedagogical content knowledge. Understanding mathematics concepts more deeply, and knowing what students might think about those concepts, are important parts of teaching, but what affects students directly is what teachers do with this knowledge in their instructional practice. One method of addressing how teachers bring disciplinary and/or pedagogical content knowledge to bear on student thinking during instruction is to provide them with experience in considering how particular mathematics tasks and instructional strategies develop student understanding, along with opportunities to apply and reflect on this knowledge in their own classroom instruction. Seven research studies identified in this review investigated professional development programs using versions of this strategy.

Findings from Research

Studies of six of the seven professional learning experiences that included analyzing classroom instruction found positive effects on participating teachers’ content knowledge. The exception, the Santagata (2009) study, reported no conclusions about increases in teacher content knowledge, focusing instead on the difficulties teachers encountered with the intervention. Although none of these studies investigated the unique contribution of the strategy of analyzing classroom instruction, generally positive results across programs support claims regarding its effectiveness in deepening teachers’ mathematics disciplinary and pedagogical content knowledge.

The studies spanned a breadth of grade levels, with teacher participants ranging from grade 1 to grade 12 but concentrated in the upper elementary and middle grades. Across the studies, topics in number and operations, algebra, geometry, data/probability/statistics, communication, problem solving, and representation were addressed. The experiences for teachers in these seven interventions included two structured as four-week summer institutes (Basista & Mathews, 2002; Swafford, Jones, & Thornton, 1997; Swafford, Jones, Thornton, Stump, & Miller, 1999), one of which, the Swafford and colleagues program, also provided six half-day follow-up sessions during the ensuing academic year. A third involved teachers in monthly meetings with university faculty over two academic years, as well as visits by university faculty to the teachers’ classrooms (Sowder, Philipp, Armstrong, & Schappelle, 1998). Four studies focused on professional development activities that took place periodically throughout the academic year (Goldsmith & Seago, 2007; Lin, 2002; Santagata, 2009; Vale & McAndrew, 2008). The Lin (2002) and Goldsmith and Seago (2007) studies involved reflecting on cases from classroom teaching. In the Santagata (2009) study, teachers watched and analyzed videos of classrooms and then taught the same lessons they had analyzed. The intervention in the Vale and McAndrew (2008) study involved analysis of mathematics problems and activities, analysis of teaching materials and strategies, design of problems and activities, and analysis of students’ solutions.

In addition to the use of the strategy of engaging teachers in analyzing classroom instruction, some other commonalities are evident among these seven experiences. First, all seven programs included strategies to help teachers connect the mathematics content they were learning to their classroom teaching. Second, all seven experiences included facilitation by university faculty from mathematics or mathematics education departments. Neither of these features was studied systematically, but their common occurrence in these experiences suggests some potential importance with respect to the goal of deepening teachers’ disciplinary and/or pedagogical content knowledge.

Except in one case (Santagata, 2009), in which the professional development was made mandatory by the district, teachers participated in these experiences on a voluntary basis, so the generalizability of the findings from these studies must be considered in this light. With this one exception, the populations that the participating teachers represent are limited to those willing and able to commit to such extensive interventions.

There were potential validity issues with the measures of teacher content knowledge used in each of the studies. Although all of these studies either used a pre-post design to measure changes in teachers’ content knowledge or traced changes over time, only one of the studies (Goldsmith & Seago, 2007) used comparison groups of teachers who did not participate in the professional development programs. It is possible that participating teachers might perform better on a measure of content knowledge on a post-test simply because they had completed it previously, in one case (Basista & Mathews, 2002) only a few weeks before. The use of multiple measures addresses this concern to some extent; for example, in the Swafford and colleagues (1997, 1999) studies, the participating teachers performed better in three different content areas, and on three separate measures of knowledge of geometry, following treatment. Similarly, the Sowder and colleagues (1998) study used written instruments and interviews with teachers to triangulate findings, and the Santagata (2009) study supplemented survey data with field notes and memos. Only 3 of the 7 studies, including 2 studies of one intervention (Goldsmith & Seago, 2007; Swafford et al., 1997; Swafford et al., 1999), used any externally developed measures of teacher content knowledge. Otherwise, little information was provided on how the measures used in the studies were developed and validated for the purpose of assessing growth in teachers’ content knowledge.

Additional limitations were noted regarding some of these studies. In 5 of the 7 studies, the intervention was delivered by the researchers, which raises concerns about possible researcher bias and calls into question both the replicability of the interventions and the generalizability of the results (Basista & Mathews, 2002; Goldsmith & Seago, 2007; Lin, 2002; Sowder et al., 1998; Swafford et al., 1997; Swafford et al., 1999). In addition, there is a possibility of selection bias in two of the studies (Lin, 2002; Sowder et al., 1998), as teacher participants were chosen based on prior relationships with the researchers.

For the research on engaging teachers in analyzing mathematics classroom instruction bibliography, click here. [PDF 8K]

The seven studies described above were part of a more inclusive review of research on experiences intended to deepen teachers’ mathematics content knowledge. For more information, a summary of research on experiences intended to deepen teachers’ mathematics content knowledge can be found here. [PDF 120K]

The literature search surfaced seven research studies of professional development programs that engaged teachers in analyzing science classroom instruction. Each intervention included several strategies; none was designed to measure the unique influence of engaging teachers in analyzing classroom instruction. Still, each one reported some evidence that teachers’ science disciplinary and/or pedagogical content knowledge increased (Basista & Mathews, 2002; Clermont, Krajcik, & Borko, 1993; Heller, Daehler, & Shinohara, 2003; Lustick & Sykes, 2006; Rosebery & Puttick, 1998; Wang, 2001; Williamson & Jose, 2008). Teacher participants in the studies ranged from kindergarten to grade 12. Various earth, life, and physical science topics were represented.

Research on Engaging Teachers in Analyzing Science Classroom Instruction

Professional learning opportunities for teachers of science have increasingly focused on deepening teachers’ disciplinary and/or pedagogical content knowledge. Understanding both science concepts and what students are thinking about these concepts are important parts of teaching, but what impacts students directly is what teachers do with this knowledge. One method of addressing teachers’ understanding of student thinking about science is to provide them with experience in considering how tasks and instructional moves might help to develop student understanding of particular science concepts, including an opportunity to apply and reflect on this knowledge as it relates to their own classrooms. Seven research studies identified in this review investigated professional development programs using this strategy.

Findings from Research

Each of the seven studies of interventions that included analyzing classroom instruction reported at least some positive impacts on participating teachers’ content knowledge. Although none of the studies investigated the unique contribution of the strategy of analyzing classroom instruction, consistent positive results across programs support claims regarding its effectiveness in deepening teachers’ science content knowledge.

Teacher participants in the seven studies ranged from kindergarten through grade 12. Four of the studies focused on physical science, and the other three focused on more than one science field. One of the studies focused on teachers’ understanding of the nature of science rather than on any particular disciplinary content area (Wang, 2001).

The experiences for teachers in the seven studies varied widely in intensity and duration, including two summer institutes (Basista & Mathews, 2002; Williamson & Jose, 2008), an independent inquiry into various science topics over a four-year period (Rosebery & Puttick, 1998), and three in-service programs, one lasting two weeks (Clermont, Krajcik, & Borko, 1993) and two occurring over the course of a year. Of the two year-long programs, one included six to eight three-hour sessions of hands-on investigations linked to teaching cases (Heller, Daehler, & Shinohara, 2003), and the other included nine two-hour meetings with the participants, planned visits to classrooms for participants to observe instruction, and debriefing meetings with the course instructors following the observations (Wang, 2001). A final study examined the National Board Certification process, which involves teachers in developing an extensive portfolio about their practice and taking an exam (Lustick & Sykes, 2006).

In addition to the diversity of the interventions investigated, the studies also used a variety of measures to assess teachers’ gains in content knowledge. Four studies used written tests. Basista and Mathews (2002) used an assessment developed for the institutes they studied, which focused on the integration of science and mathematics (e.g., using mathematical modeling of population growth); Heller and colleagues (2003) developed an assessment using items from the National Assessment of Educational Progress (NAEP) and Trends in International Mathematics and Science Study (TIMSS); Wang (2001) used the Views of Nature of Science (VNOS) instrument; and Williamson and Jose (2008) used the General Chemistry Conceptual Evaluation as well as a series of spatial ability tests. The other three studies used case studies and interviews. Clermont and colleagues (1993) assessed teachers’ responses during interviews; Rosebery and Puttick (1998) analyzed videotapes of a case study teacher at three different time points; and Lustick and Sykes (2006) used observations and interviews around a video of classroom discussion, student artifacts, and descriptions of classroom situations.

Teachers participated in each of these experiences on a voluntary basis, so the generalizability of the findings from the studies must be considered in this light. The populations that the participating teachers represent are limited to those willing and able to commit to such extensive interventions. The small sample sizes in 5 of the 7 studies (Basista & Mathews, 2002; Clermont et al., 1993; Rosebery & Puttick, 1998; Wang, 2001; Williamson & Jose, 2008) also raise questions about how broadly the results might appropriately be generalized.

Although all seven of these studies used either a pre-post design to measure changes in teachers’ content knowledge or traced changes over several points in time, none of the studies used comparison groups of teachers who did not participate in the professional development programs. It is possible that participating teachers might perform better on a measure of content knowledge on a post-test simply because they had completed it previously.

For the research on engaging teachers in analyzing science classroom instruction bibliography, click here. [PDF 8K]

The seven studies described above were part of a more inclusive review of research on experiences intended to deepen teachers’ science content knowledge. For more information, a summary of research on experiences intended to deepen teachers’ science content knowledge can be found here. [PDF 187K]