A+ Education Partnership
 

ABPC Blog

A Principal's Struggle to Learn, Grow and Measure
April 10, 2014
 
We're pleased to share this article from the March issue of Learning Forward's bimonthly action brief Transform Professional Learning. Nathan Pitner, principal of Brookwood Forest Elementary (Mountain Brook), explains his team's efforts to evaluate the learning part of professional learning instead of traditional outputs. He finds the best success when teachers are guiding their own problems of practice.

 
by Nathan Pitner
 
In the past, when we evaluated our professional learning, we evaluated attitudes: the popularity of the professional learning, what it looked like, whether it was helpful, and so on. Ultimately, though, I’m not sure we were really evaluating the effectiveness of the learning. That is the process we are carrying forward now – evaluating the learning and not the popularity. This is not easy to do, but we found the best success when teachers guide their own problems of practice.
 
Problems of practice target the misconception that teaching and learning are the same thing. This year we’ve taken a considerable step forward by having grade levels begin looking at problems of practice. Teams work backward from the evidence they hope to see in student learning to consider ways teachers need to alter their practice or teaching. We accept the correlation between our practice and student growth.
 
The format we’ve used attempts to streamline three questions: How will my students show evidence of their learning? How will my instruction or content change to help students produce this evidence? What skills or learning do I need to develop to implement this change?

We’ve loved the way problems of practice help us see students, content, and teachers in the same context of the learning process. Most importantly for our teachers, it separates what they need to do from what they need to know to do it. The first step of measuring professional learning is developing an idea of what learning we actually want to measure.
 
When we say we want teachers to be learners, it shifts the culture — how the faculty meets together, what they talk about and how they talk about it — and affects how we put a school improvement plan together and what we look for in student growth. Our school improvement plan is not generated by one or two people behind closed doors. Instead, the entire faculty is thinking about what they know and what they see as problems of practice each year. We will sit down with a design team and representatives of faculty teams and counselors and review everyone’s problems of practice. Teachers see their voices in the process.
 
Connect adult learning to student learning
Connecting the professional learning to student learning, figuring out what to measure, is the hard part. Problems of practice have been helpful because the teachers had a hand in determining what those measures would look like. I’m not sure we do an effective job when we measure professional learning in a vacuum. Instead, we try to measure the effects of professional learning through student learning and anecdotal evidence that we explore during our weekly learning community conversations.
 
We regularly video each other and watch the videos together. To me, this is the strongest evidence of a highly developed team. It takes vulnerability for teachers to honestly talk about their craft. It is not a judgment of good or bad teaching, but an opportunity to objectively see the high-leverage characteristics that we targeted. We believe true growth is dependent on people becoming comfortable in watching each other and analyzing their craft.
 
When teachers know what high-leverage characteristics are and get practice, peer feedback, and coaching, that gives us something to measure. We can’t measure effectiveness if we don’t think through the problem backwards and determine the high-leverage characteristics.


 
Practically speaking, if your school doesn’t have a consistent or defined vision of what to accomplish, then you start to lose the momentum or morale to push one another to grow toward the established goal. In this way, we have been greatly influenced by the work of Michael Fullan and Andy Hargreaves on the concept of social capital, or the group moving the group toward the goal. Like so many schools, we have amazing teachers who want to accomplish amazing things. We have to help them answer the question of how.
 
As teachers develop a sense of the right thing to do, the administration has a responsibility to protect them in that process and build a learning community where people are open to making mistakes in front of others so they can grow. Often, educators will say they are open to growth, but the next step in that process is being willing to make mistakes. Teachers and administrators have to realize they are not always going to be perfect and accept that they are a valuable part of the process. Being open to everyone and leveraging the learning with each other is hard.
 
Define what to measure
Administrators are challenged to think about what we can measure and whether we are really measuring what we think we are. As with SMART goals, are we focused not just on the teacher or curriculum but all the way through to students, content, and what the teachers are doing? Maybe we assessed the success of an hour, but not beyond that hour. Or maybe we assessed the long-term effects but not the characteristics that led to growth. We have a responsibility to help teachers think through what they want to measure and how they will measure it.
 
All that ties back into effectively measuring whether the learning is working for teachers. What we currently want to improve at my school is the point where teachers are honest about where they are in the process. We are honest about where we need to grow, but trying to balance anecdotal evidence and quantified student growth is difficult. People talk about the art of teaching and how that makes it hard to quantify, but that does not excuse us from making a good effort to do it.
 
Right now, we’ve developed a composite picture. With so many things going into student learning, knowing your craft and honing it enough to point to the high-leverage factors at work is difficult. For a teacher, that is even more complicated. Students are constantly shifting. Everything is moving. Teams and students may be new. A group of students may look different in December than in October. With so many moving parts, there is a reluctance to attribute professional learning as a major factor in student growth, and there is difficulty in finding quantifiable measures. It can be hard for educators to link professional learning to student growth because of the reluctance to piece it all together, so it is my responsibility to help keep the vision clear.
 
To maintain the vision, we first don’t want our meetings to be measured by popularity. The reality is that every workshop is the product of previous feedback. When we talk about learning opportunities, we do want to get immediate feedback, such as what was helpful, what should we consider next time, where can teachers connect, etc. I don’t want to measure whether we did a good job talking to people. I want to measure whether we can apply what we talked about. We leave with anecdotal and targeted feedback, then we follow up with check-ins and conversations with coaches, with learning communities, and around student data throughout the process.
 
Our midpoint check-in is teachers getting back with coaches via videos and learning community conversations, looking at how it is being implemented, what it looks like, and what we see.
 
The third stage is student learning. Is it making a difference? What evidence do we have as outcomes?
 
The fourth part is being able to have debriefing conversations that tie correlations together. Was this result a product of this practice? Why do we think that?
 
From the first stage of gathering feedback on effectiveness, to the middle stages of comparing processes and student data, to the last piece of debriefing with design teams and vertical representatives, we look at where we have learned, grown, and struggled.
 
What student learning products can we present? That part is harder. We are still growing in this process. We have multiple sources of feedback and records of design teams reflecting on what works and doesn’t work. It helps us to continue to build structure. Right now we are using a composite of hard and soft data to measure effectiveness, but our goal is to be more and more specific.
 
Going into our third year, as we take the next step, the learning gets deeper and deeper. Having the structure to shape conversations is important. The conversations are about learning and student results, and they correlate our craft with the results we are getting. We are still in the process of feeling good about our indicators. Much of what we do now is conversational and based on looking at student work and what we are doing in our practice. But when the teachers see that it is working, they learn from and with each other and the enthusiasm spreads like wildfire.
 
[“Evaluate the learning, not the popularity: A principal’s struggle to learn, grow, and measure” by Nathan Pitner, Transform Professional Learning, March 2014.]
 
Used with permission of Learning Forward. All rights reserved.
 
 
Good read: Learners need "toddler" freedom to find mastery
April 1, 2014
 
by Cathy Gassenheimer
On a warm summer evening, watching the crowd at a street festival, I was fascinated by a toddler who was learning to walk. He was moving enthusiastically and enjoying his progress. His method was quite comical, more arms than legs, propelling himself along as if he were rowing a boat. His parents watched with delight, realizing that form and grace didn't matter, but mastery did. They were beaming with pride. No one said, "Bad Baby, you're not doing it right!" No one tells... [continue reading]
 
 
Tarrant High Students Learn the Value of Scientific Research by Teaming Up with NASA and UAB
March 28, 2014
 
by John Norton, ABPC communications consultant
If the Sunday forecast for Cape Canaveral were better, nine Tarrant High School students would personally watch as their biological science project lifted off for the International Space Station inside a SpaceX Falcon 9 rocket. Even though the weather prompted NASA to delay the resupply mission (which will deliver a 5,000-lb payload of scientific experiments and supplies), the THS 11th graders are still making the trip to Florida. It's not the first... [continue reading]
 
 
Interview: How Concept-Based Teaching Can Deepen Student Learning
March 26, 2014
 
by Cathy Gassenheimer
Lynn Erickson and Lois Lanning are the authors of an important new book for educators who are rethinking curriculum and instruction to meet higher standards and go deeper with students into knowledge, understanding, and doing. After reading Transitioning to Concept-Based Curriculum and Instruction: How to Bring Content and Process Together, I was eager to have Lynn and Lois talk about their book and how it might help Alabama schools as we transition to more powerful teaching... [continue reading]
 
 
Close Reading: Do Leveled Texts "Lead to Leveled Lives"?
March 3, 2014
 
From The Marshall Memo (2/18/14). Summary by Kim Marshall. Used with permission.
Rethinking Small-Group Instruction with Informational Texts
In an important article in The Reading Teacher, Douglas Fisher and Nancy Frey (San Diego State University) question the traditional model for small-group reading instruction: students working with leveled texts, with texts doing the scaffolding. The problem, say Fisher and Frey, is that the criteria most schools have been using for instructional level don... [continue reading]
 
 
 
 
 
 