See, NYT says 15x and people pull that one figure out and run with the spin. Omg, 15x! (In this case, I don’t mean you, mathyone.) Chronicle picks a number, too.
Here’s what Hurwitz wrote; maybe you saw it. It’s the digest that appears in various sources:
“In this paper, I examine the impact of legacy status on admissions decisions at 30 highly selective colleges and universities. Unlike other quantitative studies addressing this topic, I use conditional logistic regression with fixed effects for colleges to draw conclusions about the impact of legacy status on admissions odds. By doing so, I eliminate most sources of outcome bias by controlling for applicant characteristics that are constant across colleges and college characteristics that are constant across applicants. I estimate that the odds of admission are multiplied by a factor 3.13 due to legacy status. My results also suggest that the magnitude of this legacy admissions advantage depends greatly on the nature of the familial ties between the applicant and the outcome college, and, to a lesser extent, the selectivity of the outcome college and the applicant’s academic strength.”
Remember, he wrote it as a grad student in the school of education. While he can work with numbers and try to control for various factors, it’s an academic view, not one based on experience with applications or how decisions are made, page by page.
@mathyone - I do not know of a free link to the Hurwitz study, but I have access through a subscription service.
You are right to be confused. The summaries that the newspapers have are definitely the “for dummies” version, and can be highly misleading if you try to use the results as summarized. Respectfully, even northwesty’s summary of the article distorts many of the key points that the author is making … the stuff about controlling for SAT scores etc. that northwesty writes is just wrong. I’d have to write a long post to explain the problem. In the meantime, I’d suggest that you just disregard it.
Here’s an example of how the math works if you are interested -
Hurwitz estimates that the odds of admission for a “primary” legacy are 7.63 times the odds for a non-legacy.
When Hurwitz says “odds”, he’s using the word in the statistical sense; the 7.63 is an “odds ratio,” a ratio of odds, not a ratio of admission probabilities. It’s like in horse racing … if a horse has a fair payoff of 3:1, then its odds of winning are 1/3 while the probability that it wins is 25%. Logistic regression models use odds rather than probabilities because the resulting models have much better properties.
Here’s an example. The average acceptance rate in the sample of 30 colleges for non-legacies is 19.2%. This gives odds of (.192)/(1-.192) = .238. Multiplying this by 7.63 (the primary legacy factor) gives odds of 1.81. Converting this back to an acceptance rate gives 1.81/(1+1.81) = 64.4%. So 64.4% is (crudely) the average acceptance rate for primary legacies. Finally, 64.4% - 19.2% = 45.2%, which crudely gives the model’s prediction of the average advantage of a primary legacy over a non-legacy.
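For anyone who wants to check the arithmetic, here’s a quick Python sketch using only the numbers from the post above (the 19.2% base rate and the 7.63 odds multiplier):

```python
def prob_to_odds(p):
    """Convert a probability to odds (p to 1-p)."""
    return p / (1 - p)

def odds_to_prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

base_rate = 0.192          # average non-legacy acceptance rate in the sample
legacy_multiplier = 7.63   # Hurwitz's estimated primary-legacy odds multiplier

legacy_rate = odds_to_prob(prob_to_odds(base_rate) * legacy_multiplier)

print(f"non-legacy odds: {prob_to_odds(base_rate):.3f}")   # 0.238
print(f"legacy rate:     {legacy_rate:.1%}")               # 64.5%
print(f"advantage:       {legacy_rate - base_rate:.1%} points")
print(f"ratio of rates:  {legacy_rate / base_rate:.2f}x")  # ~3.36x
```

(The tiny difference from the 64.4% above comes from rounding the odds to 1.81 before converting back.)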
The 45.2% add-on factor for primary legacies’ “admission chances” was an attempt to make it easy for casual readers to understand the results, but it would make a horrible statistical model. It’s not a good way to summarize the results other than at the crudest level.
Respectfully, I don’t think it’s possible to have a useful discussion of the Hurwitz study in this thread because it would take a long time to get everyone up to speed since they don’t have access to the study. It would take a few long posts to clear up the confusion.
I also think there is a problem with the study that explains why his results are biased higher than they should be.
Regardless, I think it’s clear that the legacy advantage is pretty high. Denying this obvious fact seems foolish to me.
As someone who has served on an admissions committee for Ph.D. admissions at an elite school, I do not think it is useful or accurate to try to portray admissions as some sort of mystical process that only insiders can understand. The pursuit of rational inquiry is at the foundation of the modern university. Statistical summaries can definitely help shed some light on the admissions process.
I wasn’t saying that it was 15x everywhere. I’m just a bit puzzled as to how this study was able to measure a rate 15x as high. I guess there must be a school where a bunch of underqualified legacies who would never otherwise have a chance are getting auto-admitted while their similarly underqualified non-legacy counterparts are accepted at 6% or less. I would think this would show up in a break-out of SAT scores by legacy. Yes, I know it’s not all about SAT scores, but I think a preference this dramatic would show up there and be easy to spot, just as one can see lower SAT scores for recruited athletes.
Might also show up in a disproportionately high fraction of legacies on campus if it’s a small school. Do most schools publish their percent of legacies in the student body?
Al2, statistical studies serve a purpose. But they do not capture the full process; they try to model it. I’m involved with undergrad and there are distinct qualitative differences among candidates, even among top performers. Unlike grad programs, this hinges on a self-presentation (app/supp) from high school, before the real work has begun. The filter is different.
Mathyone, you can try to access it via Researchgate.
I agree that Ph.D. and undergrad admissions are very different. I also agree that statistics can mislead people if they naively apply the results to an individual student.
But I object to trying to dismiss the obvious strong advantage that these studies find that legacies have (on average) with an appeal to qualitative differences that are only apparent to the cognoscenti. You can very appropriately use statistical studies to quantify the advantage that legacies have, and there’s no doubt that it can be pretty big.
OP here, back to something I know as a statistician/epidemiologist. The 15X was probably an odds ratio misinterpreted as “…times as likely (probability)” rather than 15X the odds. If the baseline rate is high, for example, greater than 20%, the odds ratios look very high compared to the actual percentages. When the baseline rate is low, less than 10%, the odds ratio is very close to representing “times as likely.” I used to find this mistake made almost weekly in the research articles I review.
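To illustrate the point (with made-up baseline rates and a hypothetical 15x odds ratio, not figures from the study), here’s a quick Python sketch of how the gap between an odds ratio and “times as likely” grows with the baseline rate:

```python
def apply_odds_ratio(p, odds_ratio):
    """Return the probability implied by multiplying the odds of p by odds_ratio."""
    new_odds = p / (1 - p) * odds_ratio
    return new_odds / (1 + new_odds)

OR = 15  # hypothetical odds ratio, as in the misread "15X" headline number
for baseline in (0.01, 0.05, 0.10, 0.20, 0.40):
    p_new = apply_odds_ratio(baseline, OR)
    print(f"baseline {baseline:4.0%} -> {p_new:5.1%}, "
          f"i.e. {p_new / baseline:4.1f} times as likely (odds ratio is {OR}x)")
```

At a 1% baseline the odds ratio and “times as likely” nearly coincide (about 13x), but at a 40% baseline a 15x odds ratio is only about 2.3x the probability.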
So Al2, you just want to look at this academically? (No disrespect.) Because, for a most selective, those qualitative differences can be huge factors in actual admissions decisions. Including among legacies and L vs non-L.
Analytically, yes, there is a legacy bump. But it’s not as simple as knee-jerk rubber stamping, at an elite. From a practical perspective, one can’t ignore the qualitative aspects and how any kid shows them or not.
Kids/families don’t need to be insiders to learn more about their college targets. The CC emphasis on stats, checking the CDS and then throwing your hat in the ring, does a disservice. See it all the time.
@al2simon, thanks for clarifying. So it sounds like the number many would care about is, in your example of the average data, the comparison of the 64.4% admit rate for primary legacies to the 19.2% overall admit rate: a 3.35-fold higher rate for primary legacies than for other candidates who are, by Hurwitz’s criteria, equally well qualified.
A couple of points. I think it may matter a lot what 30 colleges were included in the study. We spend a lot of time talking about Harvard, Yale, and Princeton here, and the study, even if entirely accurate, may tell us little about them–even if they are part of the group being studied. In other words, the average advantage among this group of colleges is not really helpful at all, although the range is pretty interesting. (Note, for example, that the average admission rate across all 30 colleges was much, much higher than the rate at HYPS.)
My observation, admittedly anecdotal, tells me that many legacy admits (and rejects) are admitted to peer schools, or at least schools that aren’t much less selective. If the advantage was a “brick,” we’d expect to see a lot of legacy rejects enrolling at the state flagship. Maybe people who have that experience are less likely to report it here at CC.
Here’s another little thought experiment, based on the following:
What if there was a college that had only two admissions criteria: ACT score, and legacy status. It has 1000 slots. This year, it has 1000 applicants who have a score of 36, but who are not legacies, and 500 legacy applicants with scores of 36. Let’s say it takes all 500 legacies, and 500 non-legacies. Obviously, this shows a huge advantage to being a legacy, but is this a bad or unfair result? All of the applicants had the same ACT score. Isn’t it a bad result only if there is some nefarious reason to prefer legacies?
I point this out because it’s my opinion that at the most selective schools, what legacy primarily does is give an advantage to legacies over other very similarly qualified applicants. This is still a real advantage, because there are way too many highly qualified applicants for them all to be admitted.
Yes, that is the number most people would care about. Two caveats -
The actual calculations have to be done using odds ratios. You can’t just multiply the probabilities by 3.35. That won’t work.
Hurwitz’s own model does not measure whether two candidates are equally qualified using criteria like SAT scores. Here’s a quick summary:
a) Other researchers have models that control for race, gender, SAT scores, and recruited-athlete status. They find that primary legacy status increases admissions odds by “only” a factor of 3.89. For example, using the above math, a 10% probability becomes a 30.2% probability. By the way, this is more or less what I would expect.
b) Hurwitz criticizes this model. He points out what I will call the “lookingforward” objection (no disrespect). There are all sorts of things on the application (like essays, recommendations, ECs, etc.) that the above numerical variables do not control for.
c) His solution is to create a “Hurwitz model” that looks at legacy applicants who submitted applications both to the legacy school and to other elite schools. He figures that comparing admissions results between the legacy and non-legacy schools will account for all the other factors on the application that are not captured by the usual variables.
d) Drumroll please … he finds that legacies actually enjoy a bigger advantage when you take into account the other factors in the application. Instead of the primary legacy advantage being a factor of 3.89, it’s 7.63. For example, a 10% probability becomes a 45.9% probability.
From what I can tell, finding out that the 3.89 becomes a 7.63 is his original contribution to the literature.
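As a sanity check on the two conversions above (3.89 and 7.63 applied to a 10% baseline), here’s a short Python sketch; it also shows why you can’t just multiply the probability itself by the factor:

```python
def apply_odds_multiplier(p, factor):
    """Multiply the odds of probability p by factor; return the new probability."""
    odds = p / (1 - p) * factor
    return odds / (1 + odds)

p = 0.10
print(f"factor 3.89: {apply_odds_multiplier(p, 3.89):.1%}")  # 30.2%, the usual models
print(f"factor 7.63: {apply_odds_multiplier(p, 7.63):.1%}")  # 45.9%, Hurwitz's model

# Naively multiplying the probability itself overstates the effect,
# and can even exceed 100% at higher baselines:
print(f"naive: {0.40 * 3.89:.2f}")  # 1.56 -- not a valid probability
```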
Why does addressing the “lookingforward” objection make the legacy advantage even bigger? What could account for this? Here are a few possibilities:
a) Hurwitz’s suggestion … legacies are actually weaker candidates than their SAT scores, etc. would suggest. Their ECs aren’t as good, their recommendations aren’t as good, they have a high SES so they are held to a higher standard, etc. The soft factors actually work against them. Dear old mom and dad’s alma mater overlooks this, but other schools don’t.
b) Other schools see that the candidate is a legacy somewhere else. To protect their yield, they tend not to admit them … i.e., discrimination.
c) Their applications overemphasize the fit with their parent’s alma mater … they positively ooze crimson in their essays, so Yale rejects them.
d) Al2simon’s suggestion - Hurwitz made a mistake, or there’s something in the dataset that is making Hurwitz’s results much higher than the usual results.
In Section 3.1, you will see that Hurwitz throws out all ED applications that were successful but keeps all the ones that were not. You will see from Table 2 that legacies use ED at a much higher rate than other students. Keeping only the candidates whose ED applications were unsuccessful biases the sample; this retained group will be skewed towards weaker applications, and the soft factors will account for their fair share of the weakness. I suspect that this has skewed Hurwitz’s results … by how much, I do not know. I am pretty sure it means that the 7.63 factor is too high. There are some tricky points here, and I would want to think about this more before commenting further.
Could it be that there is a large group of (let’s call it) “Ivy-admissible” candidates–not slam-dunks at any particular school but likely to be admitted to several, but not all of them? If this is the case, then legacy preference could primarily serve to sort some of these applicants into the legacy school. In other words, they might have a substantial advantage in getting into the specific legacy school, but not that much (if any) advantage in being admitted to at least one peer school. If that’s the case, is there a problem? And would that be consistent with Hurwitz’s findings?
To put it another way, how common is the student who is admitted to a highly selective legacy school, but is not admitted to any other similarly selective school? If such students are very uncommon, I don’t think legacy is a very big deal, because it’s not reducing the total number of seats at schools in that tier of selectivity.
The elite colleges themselves want to make the admissions process appear to be a mysterious black box to outsiders, since they may not want to have to explain in public the reasoning for the magnitude of each thing that they look for. For legacy, they probably want alumni parents to believe that the preference is large, but probably want others to believe that the preference is small.
(Note that this may be different from, for example, a moderately selective public university, where there may be greater demands for transparency in the admission process, and where baseline academic criteria like HS courses/grades/rank and test scores more effectively differentiate between the applicants.)
@al2simon wrote “c) Their applications overemphasize the fit with their parent’s alma mater … they positively ooze crimson in their essays, so Yale rejects them.”
I cannot picture this. Please give me an example of how an essay would inadvertently reference the characteristics of a different university.
My daughter’s admissions essays were only minimally tailored, if at all, for different elite schools, and it was mostly about a few program offerings or opportunities that interested her at that school rather than being all about their unique “fit” for her. I suspect she wouldn’t have bothered to do any tailoring at all, except for the fact that she applied to a mix of engineering and non-engineering programs and it would have looked pretty bad to say how much she wanted to be in their engineering school if the school didn’t actually have an engineering school. There were of course some custom essays she had to write for various schools, but the other schools didn’t see them. And I didn’t think her end results correlated strongly with the quality of her essays, some of which were very good and some of which clearly weren’t going to wow anyone.
Maybe my writer kid will figure out how to make her essays ooze brown, but it’s a mystery to me and isn’t something my first kid figured out either.
Well, I hope it was clear I was being a bit tongue-in-cheek, but if you want someone to make up a story -
The child of a Penn alumnus or alumna absorbs the pre-professional vibe at Penn. Their essays mention that they are passionate about an intense learning environment that will equip them with the tools they need to make an impact on the world as soon as they graduate. Sounds great to the Penn people. A University of Chicago admissions reader could pick up on a few such sentences and be subtly turned off at the absence of a purely intellectual vibe. It might tip the applicant into the reject pile when the final cuts are being made. A small something to be sure, but with single-digit admission rates, who knows what will tip the balance?
@Hunt: In one section of the paper, Hurwitz breaks down his analysis of the legacy effect by tiers of institutional selectivity. If I recall correctly, Tier 1 was colleges with an admission rate below 10%, which in 2007 (the year covered) pretty much was limited to HYPS. The Tier 2 cutoff was 13.something%, so still very selective institutions at that time. He concludes that the legacy effect is greatest (by far) at the Tier 1 institutions, and (according to his data) represents a 50+ percentage-point advantage over non-legacies.
It’s not clear what the size of his “Tier 1” legacy population was, but it was probably somewhere north of 1,000. And if you want to make it a little worse, that year Princeton still had ED, so Hurwitz wasn’t even taking into account legacies accepted ED at Princeton. Honestly, that would tend to suppress the legacy advantage.
Such stimulating intellectual conversation! Back to basics: if you ask AOs about legacy applicant advantages, most likely you will be told that legacy applicants are guaranteed a “second read”. I suppose that means the pool of applicants getting the chance for a second read will have a larger percentage of legacy applicants. If we believe AOs when they say that’s the “only advantage”, then it also means that from there on, applicants’ legacy status will be “blinded” and admissions decisions will be made on factors other than legacy. How much of a boost can there be? I can’t say one way or another. You be the judge… (and I am talking about some of the ultra-selective, HYP etc.)