<p>The upshot is that the percentage of top-decile students that GW has been reporting for a decade or more (78 percent) was inflated well above the actual figure (58 percent).</p>
<p>When the Claremont McKenna SAT scandal broke last year, many folks (me included) said it was the tip of the iceberg. </p>
<p>The rampant cheating among high school students and colleges alike, added to the huge amounts of stress that the admission process creates for so many families, points to huge flaws in this process.</p>
<p>The portion of the CDS that was “fudged” was the percentage of enrolled students in the top 10% of their high school class. But apparently many (most?) high schools don’t report class rank at all. So what are colleges to do?</p>
<p>Good question. But I think they have to report data based on what they actually receive rather than on their own extrapolation. An asterisk next to the figures would come in handy: they could use it to note what percentage of high schools actually provided a rank.</p>
<p>If asterisks were added to all the “massaged” numbers, the reports would be full of them. The reality is that the organizations that rely on the CDS could not care less about the accuracy and integrity of the numbers reported. They allow schools such as Berkeley and Middlebury to obfuscate a substantial portion of their admits (winter admits) or use guesstimates for class ranks. Similar fabrications are blatantly allowed for the Fair Test babies that report bogus test scores at will.</p>
<p>And none of those supposedly objective yardsticks matches the level of deceit and gamesmanship that permeates the peer assessment surveys.</p>
<p>It will not stop until the big bosses at the schools accept responsibility for the surveys and make all reports public and verifiable.</p>
<p>Also, I agree with Xiggi, and you can add any SAT-optional school to that list. Applicants report SAT scores only if they are good, which allows those schools to inflate the reported SATs of their entering class by ignoring (not knowing about) the scores of those who choose not to submit them.</p>
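<p>Just to put numbers on that effect, here is a minimal sketch with made-up figures (the scores, the class size, and the idea that students submit only above a personal threshold are all assumptions for illustration):</p>
<pre><code># Hypothetical entering class of 10 students at an SAT-optional school.
# Assume students submit a score only if it is strong (here, 1300+).
scores = [1450, 1400, 1380, 1350, 1320, 1250, 1180, 1150, 1100, 1050]
submitted = [s for s in scores if s >= 1300]

true_mean = sum(scores) / len(scores)            # 1263
reported_mean = sum(submitted) / len(submitted)  # 1380

print(f"true mean: {true_mean:.0f}, reported mean: {reported_mean:.0f}")
</code></pre>
<p>The school never sees the bottom half of the distribution, so the “average SAT” it reports comes out more than 100 points high in this toy example.</p>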
<p>There are, as usual, more opinions than facts here, which always makes me uncomfortable.</p>
<p>What exactly did GWU fudge? They made estimates, and since the article doesn’t say otherwise, I’d give them the benefit of the doubt and say they made reasonable estimates.</p>
<p>Our local high school provides a grade distribution bar chart, from which it’s possible to make a reasonable estimate of the upper decile GPA cutoff. I don’t think there’s anything wrong with any school taking this approach, and I think that excluding this makes the reported information LESS useful. </p>
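<p>For what it’s worth, here is a rough sketch of the kind of estimate I mean. The bin counts are invented, and the uniform-spread-within-bin assumption is mine, not anything a school necessarily uses:</p>
<pre><code># Hypothetical grade-distribution bar chart: (low GPA, high GPA, count) per bin,
# sorted from the highest bin down. All numbers are made up for illustration.
bins = [(4.00, 4.00, 25), (3.75, 3.99, 43), (3.50, 3.74, 60), (0.00, 3.49, 272)]

def top_decile_cutoff(bins):
    total = sum(count for _, _, count in bins)
    target = 0.10 * total  # students needed to fill the top 10%
    seen = 0
    for low, high, count in bins:
        if seen + count >= target:
            # Interpolate within this bin, assuming GPAs spread evenly across it.
            frac = (target - seen) / count
            return high - frac * (high - low)
        seen += count
    return bins[-1][0]

print(round(top_decile_cutoff(bins), 2))  # ~3.91 with these invented counts
</code></pre>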
<p>And I think USNews’s huffiness about this is self-serving nonsense. They could easily ask colleges to provide two numbers: % top decile among those reporting, and % estimated top decile (with some clear rules about the latter estimates).</p>
<p>I have always thought of class rank as not a very useful data point anyway. How do you compare the top 10% at a competitive high school with the top 10% at a slacker high school, or the relative rank in a senior class of 700 students with one in a class of 40?</p>
<p>I agree with xiggi that peer assessment surveys are useless, and class rank data is close behind.</p>
<p>Line C10 of the CDS asks for the percentage of enrolled freshmen who submitted class rank. GW has updated their CDS to say that class rank %s have been revised as of Nov 8th. </p>
<p>“C10 Percent of total first-time, first-year (freshmen) students who submitted high school class rank: 38%<br>
C 10 Class Rank percentages are revised as of 11/08/2012”</p>
<p>ExStudent, our HS also provides a grade distribution chart, but looking at our most current example, the 10th-percentile cutoff falls 16 students into the 43-person 3.75-to-3.99 range. It is not possible to determine exactly where that 16th student falls among the 43. Since GW has revised its percentage of top-10% students down from 78% to 58%, I would have to guess they were too liberal when extrapolating. (Of course, my guess is no more sound than their methods!)</p>
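<p>To show how much wiggle room that leaves, here is a quick sketch using those figures (the uniform-spread assumption is mine; the real 16 students could sit anywhere in the bin):</p>
<pre><code># 16 students deep into the 43-person 3.75-3.99 bin, counting down from the top.
low, high, count, depth = 3.75, 3.99, 43, 16

interpolated = high - (depth / count) * (high - low)  # ~3.90 if evenly spread

print(f"interpolated cutoff: {interpolated:.2f} "
      f"(true cutoff could be anywhere from {low:.2f} to {high:.2f})")
</code></pre>
<p>Any consistent optimism in calls like that one, repeated across hundreds of students, is how an estimate could drift from 58% toward 78%.</p>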
<p>Having said all of this, as Sally commented, I truly doubt they are alone, and at least they came forward.</p>
<p>Has anyone ever noticed that whenever there is a reporting mistake involving items used in the rankings, the mistake is ALWAYS in the school’s favor?</p>
<p>Wow!!! What a surprise!!! Do you think they are the only ones? (Please read these first sentences in a sarcastic tone). Anyone looking at the stats for many of the colleges should be leery and smell something fishy.</p>
<p>GW is trying to be a real estate/Ivy League/MIT conglomerate. They’re just not doing a very good job at it. They want to be a top-tier school (at least in the US News rankings), and they so desperately want to prove it to everyone that it gets laughable at times. At the same time, though, plenty of kids like GW the way it is right now, but it’s stuff like this, the fudged rankings and the money tossed out the window on an ugly logo, that gets even those guys riled up…</p>