I truly believe that all teachers want to help students learn and succeed. They enter the profession with the goal of helping students learn and grow in their classrooms each school year. With this in mind, how do we help teachers define effective instruction, and how can we provide clear examples of highly effective practice?
This is the real work of teacher evaluation: it isn't the framework, the rubric, the tool, the growth goal, or accountability alone that will get us to this place. As building leaders, we need to have a clear vision of effective instruction and be able to build this shared understanding with teachers. Getting all stakeholders on the same page with this vision is step one in building an effective educator evaluation system.
In my role as the Executive Director, I am fortunate to work with educators around the State and Nation. I have had numerous opportunities over the past year to learn about various frameworks, see tools enacted, and discuss evaluation systems with hundreds of practitioners. Here are some words of advice and caution:
- The 2012-13 school year requires us to be in a holding pattern. All districts and schools need to have educator evaluation practices that are in accordance with the revised school code, specifically section 380.1249, but I would caution all districts against adopting a new framework or tool this school year.
WHY? The Michigan Council for Educator Effectiveness is conducting a pilot this school year. There are four frameworks being piloted and each one has a specific tool that has an exclusivity agreement:
- The 2011 Danielson Framework is paired with Teachscape
- Marzano's Framework is paired with iObservation
- The Five Dimensions of Teaching and Learning Framework is paired with GoObserve
- The Thoughtful Classroom Framework is paired with Stages
This means that these frameworks cannot be used in an electronic format with any other tool, as doing so would be a copyright violation. Currently, many districts are using self-developed frameworks that they have tweaked from research-based models (many based on Danielson 2007). These district frameworks have then been uploaded onto a variety of electronic tools. These self-developed/modified frameworks will likely not be permitted once the MCEE completes the pilot and makes its final recommendations, as we will probably be required to use one of these frameworks without any modification. In looking at other States that are ahead of Michigan in developing evaluation processes, there is often a single framework as the default model, with one or two others to choose from at the district's expense. Regardless of which of these four models is selected, it will need to be used as developed, without additional modification, as these frameworks are only research-based in the form in which they were approved.
- Spend this school year focusing on developing consistent processes based on the framework you are currently using. Administrators and teachers need to develop a solid understanding of the framework and its "look fors" in the classroom. How will implementation of these look fors result in increased student achievement?
- Spend this school year revisiting district-created common assessments. In most cases, districts are using some form of state/national testing along with district-developed assessments to measure student growth. The quality of the district/teacher-developed assessments often needs to be improved. It is wise to spend professional development time ensuring that common assessments are aligned to the common core and/or ACT's college readiness standards. In addition to these summative tests, it is also important to assist staff with the development of interim assessments. Districts looking for options to measure student growth should read the MCEE interim report, where the tools the MCEE is using for student growth measurement in the pilot are outlined.
- Educators need to take responsibility for their own evaluation. Teachers and administrators need to understand the framework and how they are being evaluated. Once a clear understanding is established, data and artifact collection should be ongoing throughout the school year to document student and professional growth. The evaluation process needs to be very transparent so that there are no surprises regarding the final rating. Both the evaluator and the person being evaluated should come to the same conclusion on what that rating is, based on the evidence collected throughout the process.
We are still in a time of great transition in our State where we need to "go slow to go fast." There is important work that needs to be done this year as we define highly effective instruction and select effective measures of student growth. Additionally, the curricular shift to the common core is an essential focus. As an Association, we will work to support building administrators in this work with up-to-date information via weblines and webposts and through high-quality, targeted professional development. This is not easy work, and we want to support you and help lighten your load.