School Improvement in Maryland

Key Understandings For CFIP

These key understandings will help school teams use the Classroom-Focused Improvement Process (CFIP) to improve teaching and increase student learning.

WHY DATA-BASED DECISION MAKING IS BEST DONE AT THE TEACHER TEAM LEVEL

The primary role of the school improvement team (SIT) is to develop a plan to improve the school. It was never intended -- nor designed -- to focus on improving the performance of individual students.

The data that SITs must use in developing the school improvement plan are usually too general and, according to testing expert James Popham, are "instructionally insensitive."6 That is, they are not intended by the state to provide information to drive daily instruction. Their purpose is to produce an accountability score and to provide very general guidance about schoolwide priority areas. In addition, the timelines required of a statewide school accountability system mean that the data used to develop school plans are usually out of date well before they are returned to schools and are often from students who have moved on to a new grade and, perhaps, to a new school.

Improvement plans based on state test data alone do not consider the wide variations that usually exist within and between grade levels and subject areas. State test data are best used to identify very broad strategies designed to increase the overall performance of the student groups that must meet adequate yearly progress (AYP) targets. The needs of individual students and teachers may be lost in this search for comprehensive initiatives that will make the difference at the school level.

The makeup of a typical school improvement team (often the school administrators, a teacher from each grade or content team, the media specialist, a few parents, and perhaps someone from the central office) and the fact that SITs typically meet once a month make it impossible for this diverse group to focus on the details of student performance that are necessary for daily improvement.

Most damaging of all, the effectiveness of the plan in improving student performance on the state test will not be known until the next assessment, perhaps a year away. These and other reasons keep SIT sessions from being the best environment for helpful, in-depth, real-time data analysis.

Data analysis is too important to be based only on state assessment data and done only sporadically in school improvement teams. This critical work must instead be done regularly by classroom teachers in grade-level, vertical, and content teams and embedded in their ongoing instructional planning.

Steele and Boudett report that one theme cutting across all the schools they studied using the Data Wise improvement process (an approach to school improvement developed by educators in the Boston Public Schools and at the Harvard Graduate School of Education) was that all of the schools used data collaboratively. They concluded that this collaborative approach to data analysis yielded at least three major benefits:

  • Organizational learning -- developing the organization's skill at creating, acquiring, and transferring knowledge and modifying its behavior to reflect new knowledge and insights
  • Internal accountability -- increasing staff members' shared sense of responsibility to one another
  • Providing a safety net for professional growth -- increasing staff's willingness to take risks and improve their craft7

See Building a Collaborative Culture in the Schools for the critical elements and the conditions necessary to support this collaborative culture.

Effective data analysis sessions at the classroom level should:

  • Be done using real-time (current) data
  • Triangulate (bring together) a variety of data sources, including external assessment, benchmark or common assessment, and classroom assessment data
  • Occur at least once every two weeks and preferably more often8
  • Use a decision-making process that enables participants to build on previous sessions and not repeat the same issues over and over
  • Include a convenient and easy-to-use template to record results
  • Enable teachers to dig down into the data to uncover whole class strengths and weaknesses as well as individual students' learning needs
  • Result in enrichments and interventions for individual students that can be redirected frequently if they are not working
  • Lead to instructional improvements that are carried out at a high level of quality
  • Have meaning for teachers; that is, teachers must believe the data analysis process is a worthwhile use of their time

A 2006 RAND research study found that if teachers do not view assessment data as timely, or if they feel that the data do not accurately measure student learning, efforts to get them to use data will fall flat.9

Data Analysis in Your Team [PDF, DOC] is based on insights from this section. Use it to assess the readiness of teams in your school to conduct effective data analyses.

  • 6 Popham, J. (2007, October). Instructional insensitivity of tests: Accountability's dire drawback. Phi Delta Kappan, 89 (2), 146-150.
  • 7 Steele, J. & Boudett, K. (2008/2009, December/January). The collaborative advantage. Educational Leadership, 66 (4), 54-59.
  • 8 Bay Area School Reform Collaborative. (2003). After the test: How schools are using data to close the achievement gap. Retrieved September 9, 2008, from http://springboardschools.org/prof_dev/research_studies.html. Also Maxwell, L. (2008, July 30). 70 districts compare practices on collecting, analyzing data. Education Week. Retrieved September 9, 2008, from www.edweek.org/ew/articles/2008/07/30/44apqc.h27.html. And Oberman, I. & Symonds, K. W. (2005). What matters most in closing the gap. Leadership, 34 (3), 8-11.
  • 9 Marsh, J., Pane, J., & Hamilton, L. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA: RAND, p.10.