National Liberal Arts Colleges SAT/ACT Scores, 2023
Rank | School | Adjusted SAT/ACT Composite Scores | Rank | School | CDS Reported Median Composite SAT/ACT Scores
1 | Williams | 1495 | 1 | Swarthmore | 1515
2 | Swarthmore | 1493 | 2 | Williams | 1514
3 | Bowdoin | 1487 | 3 | Pomona | 1513
4 | Pomona | 1487 | 4 | Wellesley | 1507
5 | Wellesley | 1485 | 5 | Bowdoin | 1506
6 | Claremont McKenna | 1478 | 6 | Amherst | 1501
7 | Amherst | 1478 | 7 | Middlebury | 1500
8 | Middlebury | 1465 | 8 | Claremont McKenna | 1500
9 | Vassar | 1465 | 9 | W&L | 1495
10 | Haverford | 1464 | 10 | Vassar | 1493
11 | Barnard | 1461 | 11 | Barnard | 1486
12 | Colby | 1457 | 12 | Haverford | 1485
13 | Carleton | 1457 | 13 | Colby | 1479
14 | W&L | 1455 | 14 | Carleton | 1476
15 | Hamilton | 1448 | 15 | Hamilton | 1473
16 | Colgate | 1446 | 16 | Colgate | 1471
17 | Smith | 1444 | 17 | Smith | 1467
18 | Richmond | 1436 | 18 | Richmond | 1460
19 | Grinnell | 1432 | 19 | Grinnell | 1460
20 | Davidson | 1413 | 20 | Davidson | 1443
21 | Macalester | 1411 | 21 | Wesleyan** | 1439
22 | Bates | 1399 | 22 | Macalester | 1425
23 | Wesleyan** | 1394 | 23 | Bates | 1424
24 | USAF | 1384 | 24 | USAF | 1400
25 | USMA | 1358 | 25 | USMA | 1358
26 | USNA | 1328 | 26 | USNA | 1328
Notes:
1. Adjusted scores are for all matriculating students.
2. ACT scores were converted to SAT using the ACT Concordance Table.
3. Adjusted scores assume matriculating students not submitting scores were slightly below the 25th percentile of SAT/ACT scores of those who submitted.
4. Data taken from the 2023-2024 Common Data Set.
5. Colby and USNA figures are estimates.
6. Wesleyan: its CDS only published scores of all matriculating students; other schools publish scores in the CDS only for those matriculating students who submitted scores when applying.
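The notes don’t spell out exactly how the adjustment in note 3 is computed. As a rough illustration only, here is a minimal sketch of one way to implement that assumption: a headcount-weighted blend of the submitter median and a score slightly below the submitters’ 25th percentile. The function, the 10-point “slightly below” offset, and the example figures are all assumptions, not the table author’s actual method.

```python
# Hypothetical sketch of the note 3 adjustment (not the table author's code).
# Each non-submitting matriculant is assumed to score slightly below the
# submitters' 25th percentile; the composite is then blended by headcount.

def adjusted_composite(submitter_median, submitter_p25, share_submitting,
                       shortfall=10):
    """Rough all-matriculant composite from submitter-only CDS figures.

    submitter_median -- CDS median SAT-equivalent of score submitters
    submitter_p25    -- CDS 25th-percentile SAT-equivalent of submitters
    share_submitting -- fraction of matriculants who submitted a score (0-1)
    shortfall        -- points "slightly below" the 25th percentile assigned
                        to each non-submitter (10 is an assumed value)
    """
    non_submitter_score = submitter_p25 - shortfall
    return (share_submitting * submitter_median
            + (1 - share_submitting) * non_submitter_score)


# Made-up example: submitter median 1500, submitter 25th percentile 1440,
# 60% of matriculants submitted scores.
print(round(adjusted_composite(1500, 1440, 0.60)))  # -> 1472
```

Under a blend like this, the size of the downward adjustment is driven mainly by the share of non-submitters and by how far the submitters’ 25th percentile sits below their median.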
I’m missing something. Fn 1 says adjusted scores are for all matriculating students. But then you say in the Wes fn 6 that other schools publish scores of only those matriculants who submitted scores. How can you rank the schools on this factor with that particular data missing for the schools that are TO (and for 2023, I’ve lost count of which schools are TO in addition to Wes and Bowdoin)?
I’m actually surprised that there isn’t more interest in the reporting discrepancies from the local data intelligentsia, who usually don’t pass up opportunities to “discuss” these things.
I’m sure I’m missing something, but I would have expected Wesleyan’s relative scoring to be better, not worse, after adjusting the (median?) scores of other schools in the way described in fn 3. Assuming all non-submitting students are below the 25th percentile seems punitive, given that many/most kids falling just below the 50th percentile, and some above it but below the 75th, will choose not to submit based on typical advice.
So, Wesleyan’s median CDS reported score puts it at 21, but then when you adjust the scores of the other schools with a seemingly punitive model (while presumably not doing the same to Wesleyan, because it already gave you 100% of the data), Wesleyan’s relative scoring goes down?
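To put rough, made-up numbers on that (reusing the weighted-blend form from the sketch under the table notes; nothing here is from the actual data):

```python
# Hypothetical school: submitter median 1500, submitter 25th percentile 1440,
# 40% of matriculants did not submit scores. Compare the fn 3 assumption
# (non-submitters just below the 25th percentile) with a milder scenario in
# which non-submitters sit just below the submitter median.
share_submitting = 0.60
median, p25 = 1500, 1440

below_p25 = share_submitting * median + (1 - share_submitting) * (p25 - 10)
below_median = share_submitting * median + (1 - share_submitting) * (median - 20)

print(round(below_p25), round(below_median))  # -> 1472 1492
```

In this made-up example, the choice of assumption alone moves the adjusted figure by about 20 points for a heavily test-optional school.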
It’s a little more confusing at first read because of the way fn 6 is written: "Wesleyan only published scores of matriculating students." If you’re not paying attention, you might come away thinking Wes was the school that was withholding data, when in fact they’re the one (and only?) school that gives you the complete data set.
I don’t quite see it, but I’m sure I’m missing something.
What doesn’t seem logical is Wesleyan’s website showing its “Class of 2027” profile and showcasing the test scores of students who were “Admitted” (but not necessarily Attending). It seems disingenuous to publish the “Admitted” scores, especially when the school’s yield is less than 33%.
Yet admitted scores are actually more helpful than enrolled scores for prospective applicants, because they reflect the pool the applicant will be competing in. I wish more schools would publicize admitted student scores.
Why is it disingenuous or illogical? It’s the admissions page.
And, further down the page, they provide exactly what you seem to think they should disclose: a profile of first-year students, with demographic information ranging from their median test scores to the date of their last vaccine. I’m not sure what more they’re supposed to divulge.
I’m also under the impression, but do not independently know, that Wesleyan includes in their composite score reporting the scores of students who didn’t submit their scores for admissions purposes. I also understand that practice is not necessarily followed by all TO schools.
I’ve always thought Wesleyan was as transparent as any school. As an example, the content you cite is headed by big, bold red letters making it clear what they are doing.
Why only the scores of Admitted Students, when over two-thirds of these kids are deciding not to attend Wes? It’s test-score reporting inflation. More relevant are the scores of matriculating students who submitted scores.
In addition to those data points, Wes provides more demographic information for the entering class of students than you could possibly want or care about.
Isn’t this what you’re saying they should disclose?
I would also add that it’s not obvious that one set of scores is more relevant than the other. It’s the admissions page. The percentile scores of those offered admission are at least as relevant to an applicant as the scores of those who chose to attend.