NJ Teacher Evaluation Pilot Program Year-One Report Released By Independent Rutgers Research Team

An independent research team at Rutgers University’s Graduate School of Education, consisting of Dr. William Firestone, Dr. Drew Gitomer, and Dr. Cindy Blitz, has found that the NJ teacher evaluation pilot program made significant strides in its first year despite some start-up challenges.

The NJ teacher evaluation pilot began in fall 2011, expanded in summer 2012, and is currently underway in 24 districts across the state. The pilot program was designed to give districts the flexibility to experiment with evaluation systems. The year-one report is informing the statewide rollout of a new evaluation framework in the 2013-14 school year.
The Rutgers research team gathered data over the past year through surveys, focus groups, interviews, and analysis of teacher observation data provided by the pilot districts. The first-year report on the NJ teacher evaluation pilot program describes the implementation of the pilot and stakeholders’ response to it – specifically, the extent and quality of implementation, barriers and successes, and comparisons with previous evaluation systems.
“The purpose of the pilot program was to help the state of New Jersey understand what it would take to put more rigorous teacher evaluation into practice.  It is clearly meeting that goal,” said Dr. William Firestone, Professor of Education Policy at Rutgers Graduate School of Education and Principal Investigator of the contract.   “Between our findings and what Department staff have learned directly, State Government is now better positioned to help and direct the districts as they move forward.” 
The report concludes that the first year of the pilot program was a successful learning year. All of the districts were able to effectively implement an observation framework, and participants expressed an understanding of the value of effective evaluation tools. However, because districts got a late start, they were unable to collect enough data to permit a firm assessment of the validity of those observations.
The year-one report from Rutgers does indicate areas for improvement. The researchers’ primary concern is the amount of personnel time required to complete the required observations and ensure their validity. The report indicates that one challenge is ensuring that administrators have the time to conduct accurate, helpful observations. Another is ensuring that both administrators and teachers receive effective training on the new observation protocols. An additional concern is the quality of communication with teachers; better and more frequent communication may increase understanding among this primary constituency.
"I'd like to thank Dr. Firestone and the entire RUGSE team for all of their work as we create a more fair, consistent, and learning-centered teacher evaluation system. We have already taken a number of steps to address some of the challenges of time and training that they have identified," said New Jersey Department of Education Commissioner Christopher Cerf. "Stakeholders across New Jersey have demonstrated a commitment over the past several years to continuously learn about and improve upon our work in public education. This is just one of the many reasons why our state is a model for the country."
The Rutgers research team will continue to present interim reports to help the New Jersey Department of Education refine the design of its teacher evaluation program. The team will also present a final report at the end of the contract.