<p>“Scientists at the Massachusetts Institute of Technology helped invent radar, high-definition television, computer memory and the Black-Scholes model for pricing stock options. Its faculty and staff include 20 MacArthur Foundation “Genius Grant” recipients.</p>
<p>But for some time, MIT now says, it wasn’t properly calculating the average SAT scores of its freshmen.</p>
<p>Those scores are closely scrutinized as a barometer of college quality. They are part of the formula used by U.S. News & World Report’s influential …” (The full WSJ.com article is available only to subscribers.)</p>
<p>Can’t help but wonder if former Director of Admissions Marilee Jones was the one who chose not to report complete information in prior years. ;)</p>
<p>There is an email list for the institutional research officers who answer Common Data Set queries at their colleges, so questions about how to report data can be asked and answered year-round. The data definitions are clear, so MIT shouldn’t have made this mistake, but I suspect other “peer” colleges have made it too, especially in reporting the scores of students who take both the ACT and the SAT.</p>
<p>I hope everyone notices that the U.S. News summary statistics themselves contain a mistake: they ADD the bottom-quartile critical reading score to the bottom-quartile math score to get a presumed (and quite possibly wrong) bottom-quartile CR-plus-math score for each college. That’s wrong because the same enrolled students may not be in each section’s bottom quartile. If, and only if, colleges actually reported interquartile ranges for combined CR-plus-math scores could that figure be passed on directly by U.S. News, but most (all?) colleges don’t.</p>
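<p>As a rough illustration of that point (the score distributions, the correlation between sections, and the helper function below are all invented for the sketch, not drawn from any college’s data), here is a small Python simulation showing that adding two sections’ 25th-percentile scores does not give the 25th percentile of the combined score when different students occupy each section’s bottom quartile:</p>

```python
import random

# Hypothetical enrolled class: strength on one section only loosely
# predicts strength on the other (imperfect correlation).
random.seed(0)
n = 10_000
cr = [random.gauss(650, 60) for _ in range(n)]                 # critical reading
math = [0.5 * c + random.gauss(330, 50) for c in cr]           # math, partly correlated

def percentile(data, p):
    """Nearest-rank p-th percentile of a list of numbers."""
    s = sorted(data)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

# What the U.S. News table effectively does: add the two sections'
# 25th percentiles, computed separately.
naive = percentile(cr, 25) + percentile(math, 25)

# What would actually be correct: the 25th percentile of each
# student's own combined score.
combined = [c + m for c, m in zip(cr, math)]
true_p25 = percentile(combined, 25)

# naive comes out lower than true_p25, because the weakest readers
# and the weakest math scorers are not all the same students.
print(naive, true_p25)
```

<p>The gap between the two numbers is exactly the error described above: unless every student in the bottom reading quartile is also in the bottom math quartile, the sum of the per-section quartiles understates the true combined quartile.</p>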
<p>By NACAC rules, colleges are to report only interquartile ranges for admission test scores, not median scores. A few colleges get caught disregarding that principle of good practice from time to time. A lot of colleges also practice the how-to-lie-with-statistics press-release technique of announcing scores for their ADMITTED classes right after admission season culminates in May. The Common Data Set, however, requests scores of ENROLLED students, which for all but the few colleges with the highest yields and greatest desirability will tend to be lower than the scores of the admitted students.</p>
<p>I wonder why U.S. News and the Common Data Set want to use test scores that the school did not even consider in its admission decisions. That doesn’t make any sense.</p>
<p>Virtually all schools use the highest score an applicant reports for admission consideration, which seems like the way an applicant would want to be treated. The only exception I’ve heard of is some Big 10 schools which use all the scores from the same sitting of the SAT (no splitting of math and verbal from different sittings).</p>
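<p>A toy example of the difference between the two policies described above (the scores are hypothetical, not any real applicant’s):</p>

```python
# One applicant's SAT sittings, as (math, verbal) pairs.
sittings = [(700, 620), (660, 690)]

# "Superscore": best math plus best verbal across all sittings,
# the policy attributed above to virtually all schools.
superscore = max(m for m, v in sittings) + max(v for m, v in sittings)

# Single-sitting rule (attributed above to some Big 10 schools):
# best combined score from any one sitting, no mixing of sections.
single_sitting = max(m + v for m, v in sittings)

print(superscore)      # 700 + 690 = 1390
print(single_sitting)  # 660 + 690 = 1350
```

<p>The superscore can only match or exceed the best single-sitting total, which is why applicants prefer the first policy.</p>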
<p>MIT’s admissions committee is supposed to be identifying the applicants who can excel in the MIT program, get maximum benefit from it, and fit into MIT’s culture and mission, not just optimizing its U.S. News ranking.</p>
<p>This is more evidence that the U.S. News ranking is bogus as a measure of the quality of the education you will receive at an institution. Its main purpose is to sell magazines.</p>
<p>If you are a college applicant, the best thing you can do is sit in on a typical freshman class at each school you are considering. You’ll then know how to “rank” the schools on your list for yourself. Too bad printing such advice doesn’t sell magazines.</p>