How do top scorers on tests fail to gain admission to top schools?

<p>“Well, if nothing else, the two UC studies do show one thing: neither grades nor test scores are a ‘strong predictor.’”</p>

<p>Again, given the above, neither would be, in and of itself – as a mark, as a number. That certainly doesn’t surprise me. In fact, I’ve seen a lot of students who were even ELC for UC, at a high-rent public, become overwhelmed at UC Berkeley, for example, with the demands of the courses, and in some cases struggle.</p>

<p>marite, kluge,</p>

<p>While it may be counterintuitive that HSGPA is a better predictor of 1st-year college GPA than test scores, that is indeed the case. The CB’s own research on the validity of the SAT has shown this for years. You can probably find the studies on the CB website if you hunt for them.</p>

<p>Regarding the correlation between SES and either SAT scores or HSGPA, as others have pointed out indirectly, SAT scores and HSGPA are very different beasties. One is a standard measure (obviously), administered the same way for everyone. The other is a conglomeration (an arithmetic average, really) of the more or less subjective measurements of a number of individuals, further influenced by a good deal of individual student decision regarding what courses to take and such. UC removes some of this variability by limiting the kinds of courses included in the calculation of a UC GPA for admissions purposes. That is the number they used. </p>

<p>While Marite may believe that SES and GPA are correlated, and Marite may believe Kluge “proved” this, I don’t see such evidence. And on a statewide basis, I’m not surprised that this supposed correlation disappears. Does Marite really believe Compton schools have lower GPAs just because Compton kids are lower SES? I suspect not. </p>

<p>Keep in mind, too, that both in CA and nationally, the dataset used (kids that matriculate) is a filtered one: only kids that are admitted to college are part of the dataset.</p>

<p>So, Marite and Kluge may be entirely correct if one considers the entire HS cohort, not just those that are accepted to college. But we will never know, at least for UC, because UC does not take all kids (and hence the range restriction mentioned earlier, too), so we can’t do the needed analysis.</p>

<p>So I grant you that it may be reasonable to hypothesize that the SAT should be more predictive of college GPA than HSGPA if all kids were considered. But the experiment will never take place, so we will never know. It could explain, though, why so many colleges continue to use the SAT in spite of its weak statistical predictive ability and weak validation on the data set available.</p>
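<p>The range-restriction point raised above can be illustrated with a small simulation. This is a toy sketch with invented parameters, not real admissions data: a latent ability drives both a noisy predictor (think HSGPA or SAT) and a noisy outcome (first-year GPA), and we then keep only the top slice of the pool, as a selective campus would.</p>

```python
# Sketch: how range restriction attenuates a correlation.
# All parameters are invented for illustration.
import random

random.seed(42)

# Latent ability drives both the predictor and the outcome,
# each with independent noise.
pool = []
for _ in range(10_000):
    ability = random.gauss(0, 1)
    predictor = ability + random.gauss(0, 0.6)
    outcome = ability + random.gauss(0, 0.6)
    pool.append((predictor, outcome))

def pearson(pairs):
    n = len(pairs)
    mx = sum(p for p, _ in pairs) / n
    my = sum(o for _, o in pairs) / n
    cov = sum((p - mx) * (o - my) for p, o in pairs)
    vx = sum((p - mx) ** 2 for p, _ in pairs)
    vy = sum((o - my) ** 2 for _, o in pairs)
    return cov / (vx * vy) ** 0.5

# "Admit" only the top 10% of the pool by predictor score.
admitted = sorted(pool, reverse=True)[: len(pool) // 10]

print("r in full pool:      ", round(pearson(pool), 2))
print("r among admits only: ", round(pearson(admitted), 2))
# The restricted-range correlation is markedly smaller, even though
# the predictor works exactly the same way for everyone.
```

The predictor has not gotten worse among the admits; the admits are simply too similar to each other for the correlation to show up, which is the caveat that applies to every UC validity number discussed in this thread.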

<p>No, I do not believe that students from one school have lower GPAs than students from another school. What I do believe is that not all 4.00s have the same value. We know that a 4.00 in a college prep class does not measure the same achievement as a 4.00 in an honors class in the same subject, at the same grade.<br>
And the enormous variability of HSGPAs is precisely why the SAT continues to be used. Not because of its predictive ability, but merely as validation. Given this concern, I wonder why HSGPAs are accepted as they are, without being manipulated in ways similar to the SAT (e.g. accounting for SES or some other factor).<br>
I am not in the least in love with the SAT, SAT II or AP exams. To me, they all have enormous flaws. I am not advocating that they should be used in lieu of GPAs. But it seems to me that the studies that have been cited do not look at GPAs with the same critical eye as they do SAT scores. Of course, this has to do with the agenda of both the UCs (wanting to ditch the SAT) and the CB (not interested in measuring things other than its own products). We just need to keep that in mind.</p>


<p>Aren’t colleges using the SAT as PART of the admission puzzle? Isn’t it so that GPA + SAT is a stronger predictor than … GPA alone?</p>

<p>Regarding the California study, and even if all the “validation” gyrations were true, it would still mean little for Californian non-UC schools, let alone schools outside California. The data culled by the researchers slaving to justify the decisions made by the UC system do not come from the entire population of high school students in California, since the eligibility rules restrict the candidates. Further, even if some believe the high schools in California to be a homogeneous example for the rest of the country, the differences in achievement among students remain glaring. </p>

<p>For what it is worth, we should remember that the AVERAGE HSGPA is 4.14 at UCLA and just below 4.0 at Cal. An average of 4.14 requires TWO AP classes and a perfect 4.0 unweighted GPA, or more AP classes with a few B’s. To put it mildly, a student does not accumulate a weighted GPA in the 4.4 to 4.67 range (the maximum possible UC GPA) without living in a culture and an environment that values and supports his or her efforts. </p>
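<p>The arithmetic behind that 4.14 can be sketched quickly. This assumes the common UC-style convention of one extra grade point per approved honors/AP course (the exact rules, such as capping honors points, vary); the 14-course transcript below is hypothetical.</p>

```python
# Sketch of the arithmetic behind a weighted UC-style GPA: one extra
# grade point per honors/AP course (a common convention; details vary).
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def weighted_gpa(courses):
    """courses: list of (letter grade, is_honors_or_ap) tuples."""
    total = sum(GRADE_POINTS[g] + (1 if honors else 0) for g, honors in courses)
    return total / len(courses)

# A hypothetical 14-course transcript: straight A's, 2 of them AP.
# (12*4 + 2*5) / 14 = 58/14, which is the 4.14 quoted in the post.
transcript = [("A", False)] * 12 + [("A", True)] * 2
print(round(weighted_gpa(transcript), 2))  # -> 4.14
```

The point of the sketch is simply that averages above 4.0 are only reachable at all through weighted honors/AP coursework, which is exactly why course availability matters.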

<p>It takes a gigantic leap of faith to believe that students in East Los Angeles have the same pipeline to Cal or UCLA as students who attend public schools in San Mateo or Northern Santa Clara. And, for that matter, that students who attend low-budget inner-city Catholic schools are in the same boat as students of Harvard Westlake, Castilleja School, or The Harker School.</p>

<p>PS Would it not be a shame if the GPA from students attending UC feeders were not a legitimate predictor at the … UC? After all, how long have people worked on the integration of K-16 in California? Indeed, it’d be a crying shame if a silly four-hour test of middle school math and English competency were a BETTER predictor for the California flagship system.</p>

<p>Totally agree with that last paragraph, xig. There is not a lot of SES-diversity in the undergrad student bodies of the UC flagships (UCLA, Berkeley).</p>

<p>Could I sum up the recent discussion this way?</p>

<p>There are feeder schools to the UCs. They are more likely to be in affluent neighborhoods than in inner-city neighborhoods. They are pretty homogeneous. Therefore, there is no need to manipulate the HSGPAs. I can buy that.</p>

<p>marite, please just define what you mean by “manipulate the HSGPA.”</p>

<p>Actually, Marite, my intent in post 157 was to state the opposite of what you took from it: In California, in general, HSGPA is not affected by SES, because schools tend to be SES-homogeneous. That is, a poor kid in a poor neighborhood has just the same likelihood of getting a 4.0 as a rich kid in a rich neighborhood. They may be getting a 4.0 in a less demanding curriculum, at a school which offers few or no honors or AP classes, but they still have a 4.0. So the distribution of grade point averages is essentially SES-neutral. A school like Berkeley High, which is SES-heterogeneous, would be the exception to that. And, anecdotally at least, the GPA’s at Berkeley High are not SES-neutral; the children of University-affiliated and other high-SES parents reportedly do predominate among the high GPA students at that school, while children of blue collar workers tend to lag academically as a group. But there are so few schools like Berkeley High in the state that, as a whole, GPA’s are distributed fairly equally among different SES groups.</p>

<p>So what does that mean? Well, its not clear. Does the low level predictive effect reported in the study mean that only high HSGPA students who come from high SES schools with rich academic offerings follow that up with collegiate success? Or does it mean that a kid from a poor school district who has the ambition to get straight A’s, and the work ethic to achieve that goal, can also succeed in college despite not having taken calculus in high school? I don’t know the answer to that; I suspect it might be a little bit of both. UC will offer admission to the kid from the poor school district (there’s a specific added admissions “boost” for students from the lowest-performing high schools at the UC campuses with specific admissions formulas; I believe that much the same happens at Berkeley and UCLA, if less formally.) So the opposite of what you surmise is actually true: All else being equal, UC will extend an offer of admission to a student with a high grade point average from a low-performing school over a student from a top public high school with the same GPA.</p>

<p>I’m thinking of taking SES or race into account, as in the case of the SAT. There need not be a numeric coefficient, but some flag denoting at what kind of school a GPA was earned.</p>

<p>See, for example:<br>
<a href="http://www.tbf.org/indicators2004/education/indicators.asp?id=2614&crosscutID=324&crosscutName=Children%20and%20Youth">http://www.tbf.org/indicators2004/education/indicators.asp?id=2614&crosscutID=324&crosscutName=Children%20and%20Youth</a></p>

<p>Let me make it clear that I do not support one system of admission over another. I am just wondering about reliability of claims made in support of the utility of SAT, SAT II, HSGPA, number of APs per school, etc…</p>

<p>kluge, the predictive effect is actually not that low considering the heterogeneous nature of the applicants and the outputs (i.e. majors and such).</p>

<p>Keep in mind too that the population that is part of the experiment is also highly skewed - you hardly find a wide range of HSGPA at the UC system. So it is pretty amazing to me, from a statistical point of view, that there’s still that much statistical power left. </p>

<p>I still don’t understand why folks here have such a hard time accepting the poor predictive power (and rather poor validity) of the SAT compared to HSGPA. Does it really make sense that a 4 hour test should be better than the views of teachers over 4 years?</p>

<p>It makes sense to me that a group of college-educated test designers could come up with a test that predicts how well I would do in college better than my high school teachers, who mostly went to worse colleges.</p>

<p>Newmassdad - I think the issue people have is that different teachers view different students and rank them by different criteria, making comparison of the resultant GPA’s somewhat of an “apples to oranges” issue. On the other hand, for all their limitations, the SAT and ACT are the same for everyone.</p>

<p>Here’s an example, based on a small data set, which may demonstrate what I’m talking about. I went back to the CSU site <a href="http://www.asd.calstate.edu/performance/apr/0506/county07.shtml">http://www.asd.calstate.edu/performance/apr/0506/county07.shtml</a>
and compared two high schools in Contra Costa County to see how their graduates fared at the various CSU’s they attended. With the CSU’s you have a much greater variance in GPAs and test scores to work with. </p>

<p>Campolindo High in Moraga is located in an affluent suburb. Saint Mary’s College is located there, but it’s not really a college town - more of a commuter/bedroom community for high SES residents of the San Francisco Bay area. Many of its graduates attend selective colleges, but it sends many to CSU’s as well - in 2005, 49. Of those 49 students, the average HS GPA was 3.14, SATs 1220. The statewide averages for incoming CSU students are 3.29 and 1020, so what Campolindo’s CSU-bound students represent is a high SES group with SAT scores above the state average and HS GPA’s somewhat below the state average. Once they got to college, however, they outperformed the state average. Freshman GPAs averaged 2.91, compared to the statewide 2.79. (It’s even more of a differential than appears at first blush since almost half of Campolindo’s graduates went to Cal Poly SLO, by far the most selective and academically demanding CSU.) </p>

<p>Richmond High School is a low SES school in an inner city setting. Few of its graduates go on to college; 11 enrolled at various CSU’s in 2005. Those 11 students had HS GPAs averaging 3.55 - above the state CSU average, but SAT’s of about 920 - below the statewide average. Once in college they underperformed the state average, netting a combined 2.62 GPA vs. statewide average 2.79.</p>

<p>Again, this is a small sampling (I don’t know the general SES of every high school in the state, nor do I have the time.) But I suspect that the pattern probably holds up overall. And that indicates to me that in this sampling, test scores correlate with freshman collegiate success better than GPA does.</p>
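<p>The two-school comparison above can be sketched as a simple above/below-average check. The first two rows use the figures quoted in the posts above; the statewide baselines (1020 SAT, 3.29 HSGPA, 2.79 freshman GPA) are also from those posts, and a real sample would pull more rows from the linked CSU site.</p>

```python
# Sketch of kluge's school-level check: for each school, compare its
# CSU entrants' average SAT and HSGPA (vs. statewide means) with
# their average freshman GPA (vs. the statewide mean of 2.79).
STATE = {"sat": 1020, "hsgpa": 3.29, "fygpa": 2.79}

schools = [
    # (name, avg SAT, avg HSGPA, avg freshman college GPA)
    ("Campolindo", 1220, 3.14, 2.91),
    ("Richmond", 920, 3.55, 2.62),
]

for name, sat, hsgpa, fygpa in schools:
    sat_above = sat > STATE["sat"]
    gpa_above = hsgpa > STATE["hsgpa"]
    fy_above = fygpa > STATE["fygpa"]
    print(f"{name}: SAT above avg={sat_above}, "
          f"HSGPA above avg={gpa_above}, FY GPA above avg={fy_above}")
# In this two-school sample, above/below-average SAT tracks
# above/below-average freshman GPA, while HSGPA points the wrong way.
```

With only two schools this proves nothing by itself, of course; it just makes explicit the pattern the larger flip-through described below the linked data is claimed to show.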

<p>It’s no surprise that high SES correlates with college success; what is masked in the UC studies, I believe, is that HS GPA for comparable students correlates inversely with SES - or to put it another way, it’s harder to get good grades in high SES schools than in low-SES schools.</p>


<p>And isn’t it why Ivy Success Richard Shaw (n</p>


<p>I can understand where folks would feel this way. But the strength of such “subjective” assessments comes from the fact that many different teachers assess, each in their own way. But the collective result of whatever the teachers are doing turns out to be a pretty good predictor of what goes on in college. This does not surprise me.</p>

<p>However, back to the point I mentioned earlier, we actually do not know if either HSGPA or SAT scores (or anything else, for that matter) are good predictors of college success. We may intuit value, but the data is NOT there. Why? Because the so-called validation has been done with kids who are MATRICULATED in colleges.</p>

<p>In other words, all we know is that IF a kid is accepted, THEN HSGPA or SAT score may predict (with lousy accuracy, Kluge?) first year grades. We actually do not know if either metric is good at selecting kids to be admitted, because we don’t know how the kids that were rejected would have done.</p>
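<p>That selection problem can be made concrete with a toy simulation (all parameters invented). We generate a full applicant pool, including the counterfactual first-year GPAs of the rejected students that a real validity study never observes, and admit the top 20% on a noisy metric:</p>

```python
# Sketch: why validity studies on matriculants alone can't tell us
# whether a metric selects well. We simulate the counterfactual that
# real studies never observe: how the rejected students would have done.
import random

random.seed(7)

applicants = []
for _ in range(10_000):
    ability = random.gauss(0, 1)
    score = ability + random.gauss(0, 0.8)          # noisy admissions metric
    college_gpa = 2.8 + 0.5 * ability + random.gauss(0, 0.4)
    applicants.append((score, college_gpa))

# Admit the top 20% by the metric.
cutoff = sorted(a[0] for a in applicants)[int(0.8 * len(applicants))]
admits = [g for s, g in applicants if s >= cutoff]
rejects = [g for s, g in applicants if s < cutoff]

print("mean GPA, admits: ", round(sum(admits) / len(admits), 2))
print("mean GPA, rejects:", round(sum(rejects) / len(rejects), 2))
# The gap between the two groups is the part of the metric's value
# that a study restricted to matriculants can never measure.
```

Within the admitted group the metric can look like a “lousy” predictor, even while the admit-versus-reject gap it produced is large; only a simulation (or an experiment that will never run) can show both halves at once.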

<p>Curious, no?</p>


<p>That’s an important methodological point. To be thorough about such investigations, one would indeed have to look at students who are rejected for admission on various grounds and see which of those grounds makes the most sense for that college.</p>

<p>Newmassdad, I think the point you make is valid for all selective colleges and universities - they all reject (or are rejected by) a group of students larger than the class they enroll, and from that group a class could be selected which would probably perform just as well as the one which is admitted. This fact has been widely publicized about Ivy League schools, but I suspect that if Berkeley and UCLA selected classes, threw out that bunch, and selected a new class from the previous rejects, no one would notice. That’s not true of the UC system as a whole, however. Every California student who wants to attend UC and meets the (pretty easy) minimum standards has a seat available to them. It’s just that many students don’t want to attend the less selective campuses, and go elsewhere.</p>

<p>Going back to the CSU data, I quickly flipped through the pages of a couple dozen high schools, and found a very high correlation between above-average test scores and above-average freshman GPA (actually, a 100% correlation; it was true in every case). I did not find such a correlation for grades. The results varied, with a majority appearing to correlate inversely - that is, there are more high schools with above-average HS GPAs and below-average college GPAs than there are with above-average HS GPAs and above-average college GPAs. The few that do correlate have high test scores as well, of course.</p>

<p>Anyone with some time on their hands might be interested in going to the site I linked to above and performing a more thorough sampling than I did, just to make sure I didn’t accidentally hit two dozen exceptions to the rule.</p>

<p>NMS:</p>

<p>I agree about the methodological issue. This is especially true since, as Kluge observes, most highly selective schools, whether public or private, reject more “qualified” applicants than they admit.</p>


<p>That would ONLY be true if all HSGPAs were created equal. How does one compare a GPA from Andover with a GPA from Brokeback Mountain High? How does one compare an unweighted GPA on a 100 scale, with an A at 93, to the weighted GPAs in the Carolinas? How does one factor in the impact of weights that boost grades by as much as 2 full grades? </p>
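<p>One small illustration of why those comparisons are ill-posed: the same set of percent grades maps to different 4-point GPAs under different school policies. The cutoffs and grades below are hypothetical but typical of the 93-vs-90 issue raised above.</p>

```python
# Sketch: the same six percent grades produce different 4.0-scale GPAs
# under two hypothetical school policies (A starts at 93 vs. at 90).
grades = [94, 92, 91, 95, 90, 93]  # hypothetical percent grades

def gpa(grades, a_cut, b_cut=80):
    """Map percent grades to 4.0-scale points with a school-specific A cutoff."""
    def pts(p):
        return 4.0 if p >= a_cut else 3.0 if p >= b_cut else 2.0
    return round(sum(map(pts, grades)) / len(grades), 2)

print("School with A at 93:", gpa(grades, 93))  # -> 3.5
print("School with A at 90:", gpa(grades, 90))  # -> 4.0
```

Half a grade point of difference from the scale alone, before weighting or course rigor even enters the picture.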

<p>Actually, the studies and statistics produced by the UC system could be Exhibit A for anyone making a case that HSGPA SHOULD be validated against national tests. The correlation of Californian HSGPA and standardized tests for classes of freshmen who average more than 4.1 and are almost all in the top 10% of their class is quite telling. Too bad it tells more about grade inflation than anything else!</p>

<p>xiggi,</p>

<p>The answer to your question is that we ignore the problem you pose. But that is because we’re talking about populations - many different measurements from many different places. So differences like the ones you posit get buried in the noise or statistical error.</p>

<p>FWIW, it is not much different for the SAT. How do you compare a single-sitting, uncoached score to the 4th-sitting score of a kid who had extensive personal coaching? Answer: you don’t. </p>

<p>Kluge, your approach is interesting, but methodologically flawed. You can’t do such comparisons with populations and then extrapolate to individuals. Nor can you compare populations as you did without controlling for a lot of other factors.</p>

<p>To me the sad part of this discussion is that much interesting data is no doubt out there, but not accessible to us lay folks. I fear a lot of selective use of data to meet PC agendas and such, much as kluge sees such behaviors in the earlier study.</p>