<p>{Warning, nerd alert}</p>
<p>Wouldn’t that admit rate actually be undefined? After all, what’s the denominator of that fraction? If zero people applied and zero people got in, then you’re dividing zero by zero, which is undefined.</p>
<p>Now, one could argue that the denominator would actually be all of the people in the world who didn’t apply. But that seems illogical, for an admit rate is defined to be the number of people who were admitted divided by the number of people who actually applied, not the number of people who theoretically could have applied, the population of the world, or any other population.</p>
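<p>To put the definition above in symbols (just a restatement of the argument, nothing new):</p>
<p>\[ \text{admit rate} = \frac{\text{number admitted}}{\text{number who applied}} \]</p>
<p>With zero applicants and zero admits, that works out to \( \tfrac{0}{0} \), which has no defined value at all, rather than a rate of zero.</p>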