Well, schools are missing, so this list isn't useful. Bowdoin's acceptance rate was 8.9%, and it's not on the list. Who knows what else is missing. Whoever made the list didn't use accurate data, so even if you are looking for schools with the lowest acceptance rates, this isn't a good list.
@Publisher if they are only using acceptance and mid SATs then why wouldn’t Ga Tech be on this list? 2019 Acceptance Rate 20.5% and Mid SAT 1400-1530. Even if you use the 2018 data (Acceptance Rate 22.5% and Mid SAT 1400-1520) it should be well above many on this list. I am calling shenanigans.
What basket of measurements would we use to compare a broader range of strengths than just test-taking ability? Using test scores may be simplistic, but it isn't a completely useless proxy for the strength of an applicant pool. And the fact that a list like this doesn't "look right" to everyone does not necessarily mean it is wrong for a large population of applicants.
The difficulty of acceptance depends much on the applicant’s profile. For example, for an upper-middle class high achieving STEM female applicant with SAT 1600, Caltech would be much easier than Stanford to get into, but for a similar applicant with SAT 1500, Stanford would offer a better chance.
Acceptance rate is largely a measure of popularity. If a school is unpopular (few want to attend), its acceptance rate will be high because it must admit more applicants to fill the available seats.
The list is primarily based on 2017 SAT score (25/75th percentile). Colleges that were test optional in 2017 were not included. So Bowdoin is excluded, but more recent test optional colleges, such as Chicago and Colby, are included.
As summarized below, #50 Case Western edged out GA Tech by 5 SAT points (the list uses 2017 SAT scores). However, I agree that GA Tech's lower acceptance rate should have at least put it on the list. It's telling that the list does not specify its methodology details, such as a specific weighting formula. That suggests the source is not reliable.
#50 Case Western -- 1340/1520 = 1430 average and 33% acceptance rate
Georgia Tech (unranked) -- 1350/1500 = 1425 average and 23% acceptance rate
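For what it's worth, since the list never publishes its weighting formula, the sketch below just reproduces the midpoint arithmetic behind the numbers above; any particular blend of SAT average and acceptance rate would be a guess on my part, so I've left the two factors side by side rather than combined:

```python
# Illustrative only: the list's actual ranking formula is unpublished.
# "Average" here is the midpoint of the 25th/75th-percentile SAT range,
# which is how the figures quoted above were computed.

def sat_midpoint(p25, p75):
    """Midpoint of the 25th/75th-percentile SAT range."""
    return (p25 + p75) / 2

# 2017-era figures quoted in this thread.
schools = {
    # name: (SAT 25th pct, SAT 75th pct, acceptance rate)
    "Case Western": (1340, 1520, 0.33),
    "Georgia Tech": (1350, 1500, 0.23),
}

for name, (p25, p75, rate) in schools.items():
    mid = sat_midpoint(p25, p75)
    print(f"{name}: SAT midpoint {mid:.0f}, acceptance rate {rate:.0%}")
```

Run it and you get Case Western at 1430 vs. Georgia Tech at 1425: a 5-point SAT edge against a 10-point acceptance-rate deficit, which is exactly why the missing weighting formula matters.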
This article/list is just lazy. The data is old and overdue for an update, and even then some of the choices/omissions don't make sense by the list's own criteria. Again I'll use GaTech data, but from 2017: SAT 1360-1490 and a 23.3% acceptance rate. Is there any reason it shouldn't land in the mid-to-upper 40s of this list, given the other schools' stats?
Based on a thorough methodology, Bowdoin should be included for consideration, since it requires (and reports) scores for all matriculating students. However, its SAT middle range (1290–1510), which sits 30 points below the 50th-ranked school's, leaves doubt as to whether it would make the ranking. Murkier, though, is on what basis a school without an available Common Data Set, such as Colby, has been included.
The list is using NCES/IPEDS-reported SAT scores, not CDS or self-reports. SAT scores for test-optional colleges are reported as "Blank" in IPEDS, regardless of whether the colleges self-report scores for all students on their websites or CDS. The 2017 IPEDS shows the following scores for Colby and Bowdoin.
Colby – 1340 / 1500 (matches scores on link from OP)
Bowdoin – “Blank” / “Blank”
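If the list was built mechanically from IPEDS, a blank score field would drop a school before ranking even begins. A minimal sketch of that filtering step (the records and field names below are hypothetical, not actual IPEDS variable names):

```python
# Hypothetical rows mimicking IPEDS admissions data; a test-optional
# college's score fields come through as blank (None here).
rows = [
    {"name": "Colby",   "sat25": 1340, "sat75": 1500},
    {"name": "Bowdoin", "sat25": None, "sat75": None},  # test optional in 2017
]

# Schools with blank score fields are silently excluded from the ranking pool.
rankable = [r for r in rows
            if r["sat25"] is not None and r["sat75"] is not None]
print([r["name"] for r in rankable])  # Bowdoin drops out
```

That would explain how Bowdoin can vanish from the list even though it publishes scores for all matriculating students elsewhere: the exclusion happens in the data source, not in anyone's editorial judgment.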
So many stats can be manipulated: superscoring vs. not; test-optional vs. not; simply deciding how heavily to weigh test scores when admitting; the ED round; heavy use of the waitlist; deferred admissions; and taking in a lot of transfers (sometimes with a lower bar of entry), or at least not enrolling them in the fall as freshmen, or sending them abroad for fall of freshman year (so that their stats don't count, at least in the eyes of US News; I don't know whether they count for NCES/IPEDS). Some schools use these methods more than others.
This appears to be using SAT scores for students who actually attend, which says little about difficulty of admission. Though it would still be an imperfect measure, the analysis should use SAT scores for admitted students; but that isn't available in IPEDS, and the authors were presumably too lazy to dig those details out.
Re #35, since admitted-student information is typically released in non-standardized forms, such as press releases, it wouldn’t create a reliable basis for comparison, IMO. I think a strength of the linked analysis, actually, is that it uses settled data. However, it obviously would benefit from an update.
Colleges should release more data on their applicant pools as well as more granular data on their admitted student bodies. Wouldn’t that be a better approach?
@Twoin18, I prefer looking at test scores of attending students as it shows the strength of the student body in a truer fashion (even if it isn’t as helpful for admissions).
Hard to say as ED & EDII become more popular and have a significant impact on admit rates. But, in general, the more information available, the more one can tailor research to one’s interests.