<p>I don’t know about a table for colleges but I think schools should track it for themselves and future students. When D chose to take an AP course with a teacher older S didn’t think was very good, I asked the principal, by email, what % of kids in the class over the years chose to take and then got a 3+ on the corresponding AP exam (she is the only teacher for this AP class). He actually didn’t know, and I think that’s data that would be useful. </p>
<p>D took the course anyway for various reasons. At parent night (we follow our kids’ schedule for like 15 minutes per class one evening before school begins), I asked, and the teacher did tell us, and that was good info to have going in. </p>
<p>I’ll ask it again of her AP teachers next year.</p>
<p>Even if one does not need to take the SAT subject tests, knowing if your high school is good can let you know if its courses will prepare you adequately for college, or if you will have to supplement on your own to avoid needing to take remedial courses in college.</p>
<p>It is a common mantra around here that one should take the AP course before taking the SAT subject test. This sounds like overkill, given that the SAT subject tests cover high school level material, not AP or college frosh level material. This theoretically means that a student who gets an A in the regular high school course should have no problem getting a high score on the corresponding SAT subject test (my own experience decades ago matched that, scores over 700 on three Achievement tests (as SAT subject tests were called then) based on regular or honors high school courses at a school where about a third of graduates went to four year colleges). But it appears that the mantra that one should take the AP course before taking the SAT subject test indicates that people around here do not trust the quality of their high schools’ courses, even though the demographic on these forums is predominantly upper income with access to the best public and private high schools where nearly all graduates go to four year colleges.</p>
<p>Publishing the correspondences between grades and SAT subject and AP scores would shed some light on whether the mistrust in the quality of a given high school’s courses is justified, although it may be less practical with SAT subject tests if few students take them.</p>
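<p>As a rough illustration of the kind of correspondence table a school could publish, here is a minimal sketch in Python (pandas assumed); the records and column names are made up for the example.</p>
<pre><code>import pandas as pd

# Hypothetical per-student records for one AP course.
records = pd.DataFrame({
    "course_grade": ["A", "A", "A", "B", "B", "C", "A", "B"],
    "ap_score":     [ 5,   4,   3,   3,   2,   2,   5,   4 ],
})

# Cross-tabulate: how many students with each course grade earned each AP score.
table = pd.crosstab(records["course_grade"], records["ap_score"])

# Share of each grade group scoring 3 or higher on the AP exam.
pass_rate = (records.assign(passed=records["ap_score"] >= 3)
                    .groupby("course_grade")["passed"].mean())

print(table)
print(pass_rate)
</code></pre>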
<p>School administrators don’t want the public to know that their schools are weak. Only administrators of a few strong schools want to show off. All school websites show beautiful pictures and positive news.</p>
<p>DS’ private HS does in fact publish grade and score stats in its “school profile” summary, sent with the guidance counselor’s packet accompanying students’ college applications, noting the distribution of grades and average scores.</p>
<p>Teaching to the test would be welcome in my book. Just depends on what test we’re talking about. I’ll propose a Strunk and White test that will include the idea that “try and” is not a proper expression. If you miss that question, you’re disqualified from journalism and public speaking.</p>
<p>Actually, the proposal is to expose grade inflation as it exists, perhaps deterring more of it. If students getting A grades struggle to get 500 scores on the SAT subject test (or 3 on the AP test after an AP course) then that is existing grade inflation that needs to be exposed to students and parents.</p>
<p>As far as teaching to the test goes, AP courses explicitly teach to the AP test anyway.</p>
<p>True, and that’s the reason I wanted to know what the pass rate was for that course. D got very good at writing FRQs and DBQs, the latter being somewhat useful if superficial analytical writing, but I am not sure if she became a better writer or developed a deep understanding of European history last year. But if the goal was to get college credit and have “rigorous courses”, mission accomplished.</p>
<p>Our HS doesn’t offer many AP science classes, preferring instead to offer “Chem II”, “Physics II”, etc. They do say most kids take and pass the APs in those subjects, but they prefer NOT to teach to the test in those cases. Actually I’m going to ask what the difference is at parents’ night this fall :D</p>
<p>I like UCB’s idea, but it would produce a lot of data that I’m not sure admissions officers would know what to do with. Here is a simpler idea. For a high school it should be possible to estimate a regression</p>
<p>GPA = a + b * composite_SAT_score</p>
<p>or to show what the average SAT scores are for students with GPAs in certain bands, say 2.5-3.0, 3.0-3.5, 3.5-4.0.</p>
<p>You could look at SAT scores conditional on GPAs to compare grading standards across schools.</p>
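<p>For concreteness, here is a minimal sketch (Python, with numpy and pandas assumed, and made-up numbers) of the two summaries proposed above: fitting GPA = a + b * composite_SAT_score by least squares, and averaging SAT scores within GPA bands.</p>
<pre><code>import numpy as np
import pandas as pd

# Hypothetical (GPA, composite SAT) pairs for one high school's graduating class.
students = pd.DataFrame({
    "gpa": [2.7, 3.1, 3.4, 3.6, 3.8, 4.0, 2.9, 3.3],
    "sat": [1050, 1150, 1210, 1290, 1380, 1480, 1100, 1240],
})

# Fit GPA = a + b * composite_SAT_score by ordinary least squares.
b, a = np.polyfit(students["sat"], students["gpa"], deg=1)
print(f"GPA is roughly {a:.3f} + {b:.5f} * SAT")

# Average SAT score within GPA bands (2.5-3.0, 3.0-3.5, 3.5-4.0).
bands = pd.cut(students["gpa"], bins=[2.5, 3.0, 3.5, 4.0], include_lowest=True)
print(students.groupby(bands, observed=True)["sat"].mean())
</code></pre>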
<p>It is nice of them to make this information available (see page 8), but it does indicate that the district is not doing a particularly good job (for A students in AP courses, the modal score was 1; this is true in the art/music, English, science, and social studies subsets of AP courses/tests; in foreign language the modal score for A and B students was 4, while in math the modal score for A students was 5 but the modal score for B students was 1).</p>
<p>Perhaps many schools or districts consider this information too embarrassing to show to the general public.</p>
<p>No, I don’t think it’s particularly useful and I don’t think it should be published. Maybe the percentage of students who pass (3 or above), broken out by course grade, would be useful, but there are so many variables that the data wouldn’t be particularly meaningful. For example, when I took my last AP test 4 years ago, the tests were $100 a pop for us (we had to test off site). That can quickly add up, so those who might not pass the test or can’t afford the fee won’t take it, skewing the data. Many strong students took the AP Gov test after taking just regular gov, whereas weaker students went on to take the actual AP class; this again will skew the data. Then you have teacher variability. We have several teachers that teach AP Calc, AP Stats, etc. Each one could have very different outcomes, and grouping them all together wouldn’t be particularly helpful. </p>
<p>FWIW, even at my extremely large public high school, students knew which teachers were going to prepare you well for AP exams and which ones weren’t.</p>
<p>Teacher variability in grading standards is a reason to publish more detailed information about AP test scores and grade distributions. A student who earns a</p>
<p>B+ in AP calculus and a 4 on the BC exam in a class where B is the median grade</p>
<p>is likely a stronger student than one who earns an </p>
<p>A in AP calculus and a 2 on the BC exam in a class where A- is the median grade.</p>
<p>I’m not so sure how helpful this information would be for a parent. I suspect the N may be too small at many schools, so the average by course grade could be skewed by a single person, especially when not all students will take the SAT II exam. Also, for smaller schools where some classes have few students, the cell size could just be too small to report the data.</p>