
Students taking AP classes in Mo., or not


My Humps—Calculus

Students in Missouri are not taking AP exams at nearly the same rate as students in other states, according to the College Board's annual report released earlier this week (Wed., Feb. 13). In Missouri, 10.6 percent of students take an AP exam, versus a national average of 24.9 percent. We're at less than half the national average.

Nationally, 15.7 percent of students earn a 3 or higher on at least one AP exam, whereas in Mo. only 6.7 percent do. In fact, Mo. ranks 46th. (Yeah, the College Board recommends against ranking states for a lot of valid reasons, but I did it anyway.)

Last summer, Mo. DESE (the Department of Elementary and Secondary Education) sent out a press release praising the uptick in the number of students taking the exam.

❝This year we sent more money to Missouri classrooms than ever before and also secured funding to encourage even more students to take AP classes, including training for more AP teachers and assistance to help cover the cost of AP exams. It is clear by our students’ outstanding performance that our investments are helping our students prepare for the challenges ahead,❞ he [Gov. Matt Blunt] said.



However, Missouri's 2.0 percent increase over the past five years is quite a bit less than the national average increase of 3.5 percent. Our students are falling behind.
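For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python using only the figures cited above (the variable names and print-out are mine, not the College Board's):

    # Quick sanity check of the College Board figures quoted in this post.
    mo_taking, us_taking = 10.6, 24.9    # percent of students taking an AP exam
    mo_passing, us_passing = 6.7, 15.7   # percent scoring 3+ on at least one exam
    mo_growth, us_growth = 2.0, 3.5      # five-year increase in those rates

    print(f"participation ratio: {mo_taking / us_taking:.2f}")    # 0.43, under half
    print(f"pass-rate ratio:     {mo_passing / us_passing:.2f}")  # 0.43 as well
    print(f"growth gap:          {us_growth - mo_growth:.1f}")    # 1.5 points behind

Notice that Missouri sits at about 43 percent of the national figure on both participation and pass rate, and the gap is widening, not closing.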

It's possible students here are taking AP courses but not the exams. Adding in IB classes wouldn't raise the rates much, since only a few high schools here offer them (Lindbergh, Metro). However, I wonder if St. Louis University's 1818 dual-credit program decreases students' motivation to take the AP exam.

I couldn't find numbers on Missouri students taking AP courses, but I did look up a few districts' offerings to compare with the national average of nine. Clayton offers 21 AP courses, including Calculus BC, Music Theory and Macroeconomics. Hazelwood offers 15, including Computer Science and Physics. I also looked up a rural district, choosing DeSoto at random. I couldn't find evidence it offers any AP courses: none were listed in the course schedule (except possibly calculus), and none were mentioned in the student handbook. The district is proposing a college prep certificate starting with the class of 2010. If that is representative of rural districts, Mo. is in trouble.

Missouri has started two centers, at Truman State and Southeast Missouri State, to help train teachers to teach AP courses. This is a good start but not enough.

But, but, why isn't MYYYYY school on the Best Schools List?

We are a country obsessed with rankings, or more precisely, with being #1, though not necessarily with the work it takes to get there. So it's no surprise that U.S. News & World Report has come out with its own best high schools list, in competition with Newsweek's best high schools list. The two use different criteria and, therefore, list different schools.

The Newsweek list, compiled by Jay Mathews, uses the number of AP and IB exams a high school gives as its sole criterion. The idea is that students improve academically by taking AP courses, so any school can raise its ranking by encouraging more students to take the exams. While the list has its faults, the criterion is easy to understand, and in St. Louis the better high schools do rank higher, so it seems to make sense.

The new U.S. News & World Report list, though, is different, and the methodology is complex. First, a high school's economically disadvantaged and minority populations have to outperform the state average on state tests. The focus here is on the achievement gap, which I will post about another time. If a school passes this criterion, it is given a college readiness score based on AP tests, both the number given and the average scores. High-ranking schools are then slotted into three categories: gold, silver and bronze.
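To make the contrast with Newsweek concrete, here is a rough sketch of the two approaches as I understand them. The weights and cutoffs are placeholders I made up for illustration; the real formulas are in the magazines' methodology papers:

    from dataclasses import dataclass

    @dataclass
    class School:
        name: str
        disadvantaged_score: float  # state-test score of disadvantaged/minority students
        state_avg_score: float      # statewide average on the same tests
        ap_exams_given: int
        ap_avg_score: float         # average AP exam score, 1-5 scale
        seniors: int

    def newsweek_index(s: School) -> float:
        # Newsweek: a single transparent number, exams given relative to class size.
        return s.ap_exams_given / s.seniors

    def us_news_medal(s: School) -> str:
        # Stage 1: disadvantaged and minority students must beat the state average.
        if s.disadvantaged_score <= s.state_avg_score:
            return "unranked"
        # Bronze schools pass stage 1 but get no college readiness score;
        # that's the mixing-of-methods problem discussed below.
        if s.ap_exams_given == 0:
            return "bronze"
        # Stage 2: college readiness from AP participation and performance.
        # These weights and cutoffs are invented, not U.S. News's real numbers.
        readiness = 50 * (s.ap_exams_given / s.seniors) + 50 * (s.ap_avg_score / 5)
        if readiness >= 70:
            return "gold"
        if readiness >= 50:
            return "silver"
        return "bronze"

The point of the sketch is just that Newsweek boils everything down to one number anyone can understand, while the medal scheme mixes a pass/fail filter with a score that only some schools actually receive.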

While I appreciate the attempt to measure multiple factors and applaud overperforming schools, I have several concerns about this method of ranking:

Mixing of methods. Since the bronze high schools don't have college readiness scores, they should be treated as a separate ranking. When I looked up the state proficiency scores for the Missouri bronze schools, they were often fairly low. Most parents aren't going to read complicated methodology papers (pdf) demonstrating that these aren't actually the "best" schools, just overperforming in some statistical sense.

Mismatch between audience and methodology. Jay Mathews explained this one well:

❝Our focus is not what works for policy makers but what is most useful for readers, particularly parents, trying to judge the quality of their local schools and others that might be available to them.❞



The overperforming schools list would be better published in Phi Delta Kappan than in a general magazine.

Too many criteria. Yes, I know that what counts as "best" depends on someone's priorities. That is why the Newsweek list works: people know what is being measured. If Andy Rotherham wants to focus on broader criteria, multiple lists would be more helpful for parents. I enjoyed clicking through the top magnet schools, top open-enrollment schools, etc.

Regional lists. If you don't live on the coasts or in another large state, you must not have any good schools, at least according to the people who do these rankings. I don't know how the numbers work out that way. Those of us in flyover country are used to being overlooked, but we have excellent college-prep public high schools too.

In Missouri, only Metro (St. Louis city's gifted magnet school) and Rock Bridge (suburban Columbia) made the silver list. No one disputes that these two are great high schools, and of course Metro has great numbers since it's selective, but I contend that some St. Louis County schools are as strong as Rock Bridge.

Twenty-seven Missouri high schools are on the bronze list, but none of them are in St. Louis County. These are schools that are overperforming for their demographics but not necessarily strong overall. I'll make a separate post with detailed numbers, since this one is already long.

I think most parents would prefer a clearer, easier-to-understand list that actually gives usable information. Here's to modifications for next year!