
FOR IMMEDIATE RELEASE

February 26, 2015
For More Information Contact:

JP O'Hare

(518) 474-1201

Press@nysed.gov


2013-2014 Final School and District Evaluation Results Released

The New York State Education Department today released the final results of the second year of statewide teacher and principal evaluation. These data include district- and school-level results. The aggregate, preliminary statewide results were released in December 2014.

The final evaluation results show more than 95 percent of teachers statewide are rated effective (54 percent) or highly effective (42 percent); 4 percent are rated as developing; 1 percent are rated ineffective. Ninety-four percent of principals are rated effective (66 percent) or highly effective (28 percent).

“The ratings show there’s much more work to do to strengthen the evaluation system,” Board of Regents Chancellor Merryl H. Tisch said. “There’s a real contrast between how our students are performing and how their teachers and principals are evaluated. The goal of the APPR process is to identify exceptional teachers who can serve as mentors and role models, and identify struggling teachers to make sure they get the help they need to improve. The evaluation system we have now doesn’t do that. The ratings from districts don’t reflect the struggles our students face to achieve college and career readiness. State law must be changed to build an evaluation system that supports teaching and learning in classrooms across the State. Our students deserve no less.”

“Unfortunately, in far too many districts across the State, we see evaluation results that do not reveal the true performance of educators, and annual student growth goals that do not reflect rigorous expectations,” Acting Commissioner Beth Berlin said. “We have recommended common-sense changes to State law that will ensure evaluation results reflect true differences in teacher effectiveness and student growth.”

Berlin noted that New York City implemented an evaluation system for the first time in the 2013-14 school year and that almost 70 percent of districts have submitted material changes to their APPR plans since the 2012-13 school year, making year-to-year comparisons difficult. Berlin also noted that, in response to the Department’s Testing Transparency Reports issued July 1, more than 100 districts submitted changes to their APPR plans that reduced testing.

Results disaggregated by district and by subcomponents vary significantly across the state. Some district evaluation systems provided a greater level of differentiation, which means that teachers and principals in these districts will benefit from more individualized professional feedback.

New York City, whose evaluation plan was imposed by former Commissioner King when the New York City Department of Education could not reach agreement on the terms of the evaluation plan with the teachers union, showed greater differentiation than most districts in the State. Although New York City teachers and principals were evaluated on the same overall subcomponents as the rest of the State, the three subcomponents used different scoring ranges to determine the subcomponent rating categories (i.e., Highly Effective, Effective, Developing, Ineffective). Fewer than 10 percent of teachers in the city are rated Highly Effective, while 83 percent are rated Effective, 7 percent are Developing and 1 percent are Ineffective.

Tisch and Berlin said the Governor and Legislature should leverage lessons learned from districts that have effectively differentiated performance and amend Education Law in three ways:

Require statewide scoring ranges for all three of the subcomponents, as was done under the New York City APPR plan.
Eliminate the locally-selected measures subcomponent. The data reveal that the local processes for assigning points and setting targets in this subcomponent do not differentiate performance. Eliminating this subcomponent could also reduce the number of local assessments students are required to take, addressing the most frequent parent concern with this State law.
Require that, for a teacher to be rated “Effective” or better on the other comparable measures of student growth (also known as student learning objectives, or SLOs, which are used for more than 80 percent of teachers who do not have State-provided growth scores), districts must set a rigorous target that a teacher’s students achieve at least one year of academic growth.

A number of districts showed similar or greater differentiation than New York City, including several participating in the Strengthening Teacher and Leader Effectiveness (STLE) grant program. The level of differentiation is significantly affected by local collective bargaining agreements and by the manner in which local measures and student learning growth targets were implemented.

Some districts showed less differentiation in 2013-14 than in 2012-13 when they made changes to their evaluation plans. Although all evaluation plans are reviewed by the Department and revised by districts prior to approval, differentiation in approved plans can still be limited by statutory allowance for local decision-making through collective bargaining. Although the Department has issued corrective action plans, those corrective action plans are prevented by statute from requiring changes that conflict with local collective bargaining agreements, even when the result is lack of differentiation.

The differences in ratings show there is much more work to be done. The Board of Regents has proposed allocating $80 million in the 2015-2016 State Budget to support the continuation of the STLE grants to help ensure that all teachers and principals across the state benefit from a strong evaluation system that supports student learning and personalized professional development.

The full results can be found at. Under "Available State Data", select Annual Professional Performance Review Ratings and State Provided Growth Ratings. Data are available under the “2013-14” tab at the State, County, BOCES, District and School level by using the search box or navigation tool bar.

-30-