A+ Education Partnership


Interpreting Your AP Instructional Planning Report

September 8, 2010 | Tags: rigorous standards & expectations, effective teaching, advanced placement, guest blogger
By Dixie Ross
Guest Blogger

Advanced Placement Instructional Planning Reports have just arrived at our schools. Every AP teacher needs to get a copy of their report so that they can identify strengths and weaknesses in their curriculum and instruction. You should be able to get your report from your campus AP Coordinator. In this post, I will discuss a simple metric that should allow you to effectively analyze your results in order to improve future students’ performance.

Turn the report over to the back side, where global and group means are listed. The global mean is the average score for everyone worldwide who took that AP exam; the group mean is the average for your own students. Note that if two teachers teach the same AP class at your school, the AP Coordinator can code students’ answer documents so that each teacher gets a report for their own students.

The problem for those of us who have very inclusive AP programs in schools with a large low-income population is that our group means will probably be consistently below the global means. This is not a reflection of poor teaching but simply of the fact that our testing group is not typical of AP students across the country. To make better use of the data on the planning report, divide each of your group means by the corresponding global mean and then compare across categories.
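For readers who like to automate the arithmetic, the ratio method above can be sketched in a few lines of Python. The score values here are hypothetical, chosen only to illustrate the calculation; substitute the group and global means from your own report.

```python
# A minimal sketch of the ratio method: divide each group mean by the
# corresponding global mean so that different categories can be compared
# on a single scale. All score values below are hypothetical examples.

group_means = {"multiple choice": 15.1, "free response": 22.4}
global_means = {"multiple choice": 24.0, "free response": 27.0}

# Ratio of group mean to global mean for each category.
ratios = {
    category: group_means[category] / global_means[category]
    for category in group_means
}

# List categories from weakest to strongest relative performance.
for category, ratio in sorted(ratios.items(), key=lambda kv: kv[1]):
    print(f"{category}: {ratio:.0%} of the global mean")
```

Sorting by the ratio puts the weakest category first, which is exactly where the report suggests focusing instructional effort.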

I will use some of my own data as examples. My AB Calculus students scored 83% of the global mean on the free response but only 63% of the global mean on the multiple choice portion. It’s the same students being tested on the same material! Why the difference? I suspect that we practice more on the free response portion of the exam, and my experience as an exam reader gives my students an edge on that portion. I obviously need to give more practice on multiple choice questions. If the numbers had been reversed (a higher percentage on multiple choice than on free response), I would conclude that my students know the material but are not communicating their knowledge (showing their work!) clearly enough on the free response.

Next, I compare percentages across the various types of questions on each portion. How did the students do on differential versus integral calculus? A discrepancy here might show that I need to re-align the amount of time spent on those topics. In English, you might compare prose performance to poetry. For calculus, there is a Part A (no calculator) that tests students’ ability to perform differentiation and integration by hand, among other things. Part B (with calculator) has 6 questions that require use of a calculator and 11 others that are more conceptual in nature. My students do well on the Part A questions, which indicates that their basic skills are solid, and on the calculator-active questions, since I cover and practice the four calculator functionalities extensively. The overall Part B percentage is lower, however, indicating that I need to improve my students’ conceptual understanding of the material.

Next, I turn my attention to the 6 free response questions. My kids rocked the motion and area/volume questions! That was no surprise. I can see, though, that they really bombed question #3, on modeling profit. You might say that was just a hard question and that kids all over the nation did poorly on it. True, but my kids scored only 42% of the global mean on it, compared to 96% on the motion problem. If they could score well on question #1, they should have been able to score well on question #3. I have to admit that something was lacking in my instruction on this topic, and I now know where to focus my efforts this year. I need to seek out those types of questions and make sure I include them in my curriculum. I need to attend workshops that might address this type of question. I will look at other calculus teachers’ reports, find someone who did particularly well on that same question, and find out what they might be doing that I am not.

As a lead teacher, one of my responsibilities is planning student prep sessions. These take place on a Saturday, and students get to see four different calculus teachers make presentations. How will I assign topics? By examining each teacher’s Instructional Planning Report to see what each is particularly good at teaching. I know I am good at motion and area/volume. My counterpart at another school is particularly good at series. Now all of the students in the district can benefit from our strengths, and we can sit in on sessions for the topics we know we need to work on.

This method of looking at the means seems to work well for all subject areas. It shifts teachers’ focus away from the fact that their group means may trail the global means and toward the areas in which they need to improve. Being a lead teacher is all about helping other teachers improve student performance without limiting student participation. I am proud that I have an inclusive and diverse group of students taking my AP class. I can accept having lower group means, but I still want to find ways to improve. The Instructional Planning Report gives me much-needed feedback every year to make that a reality.

Dixie Ross is a classroom teacher at Pflugerville High School, Pflugerville, TX, and has over 26 years of teaching experience in all levels of mathematics from remedial to AP Calculus BC. She began working as a College Board consultant in 1994. You can find curriculum modules she has written for AP Calculus and articles on vertical teaming posted at AP Central. She writes the AP Lead Teacher’s Blog.
