<p>Did he say the average SAT for legacy admits was within 2 points of the average SAT for all admits—or within 2 points of the average SAT for the entering class? The way you phrase it makes it sound like the latter, and if so, that could be a very misleading statement. At most elite colleges and universities the average stats of the admit pool are significantly higher than the average stats of enrolled students, because many of those at the top of the admit pool decide to go elsewhere, while those lower down in the pecking order among admits tend either to have fewer choices or to decide they’re “lucky” to be admitted and quickly accept. (I don’t know whether this is as true at Harvard, though, which seems to be top dog in most of the cross-admit contests.)</p>
<p>I don’t see how that follows. Doesn’t it depend on the size of the applicant pool, and on how many well-qualified applicants there are for each available slot? If the applicant pool is big enough, you can do a lot of things with it. Suppose College X has 1,000 places to fill and gets 30,000 applications, of which 2,000 are from legacies. Now suppose further that the stats profiles of the legacy and non-legacy applicants are identical, and that the school’s yields on legacy and non-legacy admits are identical, in each case 50%. Then it’s easy enough for the school to fill 200 places in its entering class by admitting 400 legacies with stats exactly mirroring the overall admit pool (of whom 200 enroll), for a 20% legacy admit rate, while filling the remaining 800 places by admitting 1,600 non-legacies whose stats also exactly mirror the overall admit pool, for a 5.7% non-legacy admit rate. Ex hypothesi, the stats are the same at the applicant-pool, admit-pool, and enrolled-student levels, but the admit rates are very different: a HUGE advantage for legacies. Unequal treatment, one might say.</p>
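<p>To make the arithmetic concrete, here is a minimal sketch of the hypothetical above (all figures are the assumed numbers from the example, not real admissions data):</p>
<pre><code># Hypothetical College X (assumed numbers from the example above, not real data)
places = 1000
legacy_apps, non_legacy_apps = 2000, 28000  # 30,000 applications in total
yield_rate = 0.5                            # same 50% yield assumed for both groups

legacy_enrolled = 200                                 # legacies fill 200 of the 1,000 places
legacy_admits = legacy_enrolled / yield_rate          # 400 admits needed to enroll 200
non_legacy_admits = (places - legacy_enrolled) / yield_rate  # 1,600 admits for 800 places

print(f"Legacy admit rate:     {legacy_admits / legacy_apps:.1%}")          # 20.0%
print(f"Non-legacy admit rate: {non_legacy_admits / non_legacy_apps:.1%}")  # 5.7%
</code></pre>
<p>Identical stats and identical yields, yet the legacy admit rate comes out three and a half times the non-legacy rate.</p>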
<p>In my view this whole debate has nothing to do with whether unqualified or less-qualified legacies are being admitted. It has to do with whether there’s any real justification, as a matter of social policy, for awarding a legacy preference. Colleges are legally entitled to do it, of course. But SHOULD they do it, and if so, why? From the vantage point of the merely EQUALLY qualified non-legacy, whose chances of admission may be half (or, in my example, less than a third) those of an otherwise identically qualified and credentialed legacy candidate, it seems manifestly unfair. And it’s hard to see any socially valuable end served by it, other than the perpetuation of a hereditary elite, which in my book is a social goal of dubious value.</p>
<p>dadx, maybe I’m missing something, but I don’t see the humor. Isn’t it the case that, by definition, legacies are almost invariably people whose parents have college degrees, making that a relevant comparison group? And isn’t it also the case that an extraordinarily large percentage of graduates of elite colleges like Duke—perhaps on the order of 85%—end up with graduate or professional degrees?</p>
<p>Again, I think this “lower average SAT scores” argument is a bit of a red herring. For the reasons posted above, I think a legacy preference is unfair to non-legacies even if the SAT profiles of the two groups are identical, because it leads to unequal admit rates in a way that is not rationally related to the achievement of any legitimate educational or social objective. And it’s not just that legacies benefit; non-legacies are affirmatively harmed, because of the zero-sum nature of the admissions process.</p>