Berea Named Best Liberal Arts College 2025 by Washington Monthly

The limit of this data is that it results from looking in the rear-view mirror. Nothing makes this limit clearer than the case of New College, which ranks top 10 nationally (#6) but really isn't the same college anymore, is it?

7 Likes

It’s 9 years from entry into college, meaning from freshman year. College Scorecard has the same wording. For most people, that would really be *five* years after graduation.

Does anyone have first-hand experience with Berea? What do you like / don’t like about it? Share with us your honest review of the school!

This list from Washington Monthly is far superior to the lame-o Forbes list. My kid’s college is up at #14, so I am certain that this is the most accurate, best and brightest ranking out there!

In all seriousness, Berea is a surprise, but no doubt its position as a well-known work college is why it performs well.

Edit: I am always amazed when a college I have never heard of, in this case Boricua College, appears on lists like these, seemingly out of nowhere. Does anyone have knowledge of this school?

4 Likes

Nope. Haven’t googled yet. Maybe in Puerto Rico, as the term is related to Puerto Ricans.

It looks like it’s actually in New York, but founded by Puerto Ricans: About Boricua College — Boricua College

1 Like

Boricua College - Wikipedia

Maybe because it serves primarily adults over the age of 25, so perhaps career changers or people looking for qualifications to earn promotions and higher salaries? It’s also relatively inexpensive, with tuition at $11k a year. Presumably most students commute. I didn’t bother to look at the methodology, but possibly there is a good amount of FA or grants, so there is less debt at graduation.

3 Likes

9 years after entry – not 9 years after graduation. They also may be including dropouts. The wording is ambiguous about whether they are using all students who enter college, including dropouts, or using grads vs dropouts to compare predicted vs actual. I suspect the former, as I can’t imagine they’d have a sufficient sample size of dropouts’ earnings for many of the small LACs that are ranked.

“We measured post-college outcomes using actual versus predicted earnings of students nine years after college entry (using just two cohorts of data). This captures the outcomes of graduates as well as dropouts.”

The familiar colleges that ranked highest for predicted vs. actual earnings are below. They’re all colleges with a large portion of students in high-earnings majors, such as CS/tech, business, and maritime. This list makes me suspect that the predicted-earnings formula doesn’t adequately consider major distribution. (A rough sketch of the over/under-performance calculation follows the list.)

  1. Harvey Mudd – Predicted = $82k, Actual = $124k
  2. MIT – Predicted = $89k, Actual = $118k
  3. Bentley – Predicted = $72k, Actual = $99k
  4. Stanford – Predicted = $80k, Actual = $107k
  5. CA Maritime – Predicted = $62k, Actual = $85k
  6. Caltech – Predicted = $97k, Actual = $119k
  7. Maine Maritime – Predicted = $60k, Actual = $81k
  8. SUNY Maritime – Predicted = $63k, Actual = $84k
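
For anyone curious, here is roughly what I mean by over/under-performance, using the numbers in the list above. I don't know WM's exact formula; a simple actual-minus-predicted difference happens to reproduce the ordering, but they may well use something fancier:

```python
# Rough sketch, not WM's real formula: rank colleges by actual minus predicted
# earnings ($k, 9 years after entry), using the figures quoted in the list above.
colleges = {
    "Harvey Mudd":    (82, 124),
    "MIT":            (89, 118),
    "Bentley":        (72, 99),
    "Stanford":       (80, 107),
    "CA Maritime":    (62, 85),
    "Caltech":        (97, 119),
    "Maine Maritime": (60, 81),
    "SUNY Maritime":  (63, 84),
}

# Premium = actual - predicted earnings; sort from biggest over-performer down
premiums = {name: actual - predicted for name, (predicted, actual) in colleges.items()}
for name, premium in sorted(premiums.items(), key=lambda kv: kv[1], reverse=True):
    predicted, actual = colleges[name]
    print(f"{name:14s}  predicted ${predicted}k  actual ${actual}k  premium +${premium}k")
```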

Like almost all college rankings, the overall ranking of best colleges has little meaning. The ranking is based on an arbitrarily selected weighting of criteria, and most of those criteria are unlikely to be meaningful to specific students considering a college.

I suppose the ranking could be used by students to get names of colleges they might not have previously considered, such as Berea, and then they could research those colleges further. Many students are not aware that Berea has $0 tuition for all students, with 81% attending at zero cost (according to its website), which seems to be the largest contributing factor to why Berea does so well in this ranking.

However, what portion of students would prioritize criteria and weightings similar to what WM used in its rankings formula, such that good-fit colleges for that student would correspond well to the rankings list? That would mean equal priorities for access (stats related to Pell Grant students), affordability (net price for students with low incomes and average debt among students who take on debt), outcomes (graduation rate and earnings-related stats, apparently without considering major distribution), and service (portion of students joining and support for ROTC, military service, Peace Corps, AmeriCorps, …).

3 Likes

I agree - and I wish people knew that before they start naming schools in the US News top 20.

I surmise that at most schools, including those deemed high pedigree, major drives earnings more than anything.

But it’s fun to debate all this… if people didn’t debate, there’d be no use for rankings :slight_smile:

2 Likes

I do! It’s in NYC, in the panhandle part of Manhattan (if you know what I mean). I visited it on purpose when I spent a day walking from the top to the bottom of Manhattan, because MacKenzie Scott had just given it a big donation. It’s not part of the CUNY system, it’s private, but it’s very affordable and was set up by Puerto Ricans to educate the neighborhood’s residents. Lots of teaching degrees, etc. It is in a really lovely hilly neighborhood, and the building is gorgeous. The other side of the building is a free museum, the Hispanic Society Museum, and it has some great old paintings as well as some modern murals that are really cool.

5 Likes

Any magazine other than USNWR has to come up with a ranking method that produces a list that is a little shocking in order to get people talking and sell magazines. Washington Monthly’s strategy seems to be to create lists that go back and forth between famous names and no-names. Thus Metro State (MN) wedged in between Notre Dame and Grinnell.

(Seriously, though, it warms my heart to see Metro State get a little love. They really do a good job.)

3 Likes

WM dramatically changed the rankings formula this year. 10 years ago, big publics dominated the rankings. In recent years, the formula changed to put the Ivy+ type private colleges at the top. This year’s formula changed again and gives equal weight to the four categories below. Typical Ivy+ colleges only do well in 2 of the 4, which allows other colleges that do well in 2+ categories to have a similar or superior overall ranking to familiar Ivy+ colleges. CSUs seem to do especially well overall with the new formula. Lists of the colleges that do best in the 4 categories are below (a toy sketch of how an equal-weight combination plays out follows these lists):

Access (good Pell related stats)
1. Southern New Hampshire (57k Pell grants among the >100k online students, leading to more Pell grants than all other colleges in the ranking)
2. CSU: NR
3. Liberty
4. UTX: Rio Grande
5. CSU: LA
…
785. Harvard

Affordability (low income students have low net price, low average loan amount among students who take loans)
1. Ozarks (uses work study instead of tuition)
2. Berea
3. Stanford (net price for low income students is negative)
4. MIT (net price for low income students is negative)
5. CUNY: John Jay

Outcomes (good grad rate and earnings stats)
1. Harvey Mudd
2. MIT
3. Princeton
4. Caltech
5. Stanford

Service (support/participation in ROTC/military, Peace Corps, …)
1. Hobart & William Smith (#3 in Peace Corps/AmeriCorps)
2. Macalester (#2 in Peace Corps/AmeriCorps)
3. U Montana
4. U San Diego (private Catholic college, not UCSD)
5. U MO-Columbia
…
1061. Northwestern (ranked >900 in both Peace Corps and ROTC)

Overall
1. Berea (#2 in affordability, worst category is outcomes)
2. CSU: Fresno (#17 in affordability, worst category is outcomes)
3. CSU: Sacramento (#18 in affordability, worst category is service)
4. CSU: Chico (#36 in service, worst category is outcomes)
5. Penn (#28 in outcomes, worst category is access)
…
51. Harvard (good in affordability and outcomes, bad in access and service)
…
549. Harvey Mudd (#1 in outcomes, bad in everything else)
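
Here's the toy sketch of what "equal weight to the four categories" implies. The category scores below are completely made up, not WM's data; the point is only that a college that is solid in two or three categories can out-rank one that is elite in just one:

```python
# My guess at the shape of an equal-weight, four-category formula (hypothetical scores).

def overall_score(access, affordability, outcomes, service):
    """Equal-weighted average of four category scores (say, each scaled 0-100)."""
    return (access + affordability + outcomes + service) / 4

hypothetical = {
    "Solid-in-three school": dict(access=85, affordability=90, outcomes=55, service=60),
    "Elite-in-one school":   dict(access=20, affordability=60, outcomes=99, service=30),
}
for name, scores in hypothetical.items():
    print(f"{name}: {overall_score(**scores):.2f}")   # 72.50 vs. 52.25
```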

2 Likes

Personally, I’d want a good outcome and I believe most would agree (although not on this website). Of course, one can’t purchase something they cannot afford.

It’s interesting that none of the categories being used to define the “best colleges” has anything to do with the quality of the teaching and learning experience at the college.

6 Likes

Maybe it assumes all are equal - but it’s a ranking based somewhat on affordability (or perhaps diversity, via affordability), etc.

Not everyone has an access issue. Many do but not all.

Everyone wants a “good outcome” for their kids, but people differ on what they consider a “good outcome.” My feeling is that a good outcome doesn’t have to mean a high paying job immediately after graduation (or within the first few years after graduation), even though this is the easiest type of “outcome” to quantify. Some people’s careers take longer to take off. Or they might spend these years building their own business(es), with a low income but potential long term payoff. Or they may ultimately have higher satisfaction in life with a career that isn’t as high paying as some.

7 Likes

If you can afford to pay your bills (and successfully retire), there’s nothing better than this… no matter your income.

How do you measure a good outcome in a college ranking formula? The WM rankings are unique in that they don’t consider graduation rate and earnings in an absolute sense, but instead only consider how they compare to expected. The colleges that overshoot expectations are ranked well. The colleges that undershoot expectations are ranked poorly. The “outcome” rankings are trying to measure the effect the college has on outcome, rather than the effect of being selective enough to admit top students.

This sounds good in theory. The problem is that it is difficult to measure “predicted” accurately. As noted earlier, “predicted” earnings don’t seem to consider major distribution, so the colleges that overperform on earnings are often the ones that overperform in the portion of students choosing higher-paying majors/fields, rather than having the highest earnings among students within a given major/field. For example, the majority of #1-ranked Mudd’s students major in CS or engineering (including dual majors). However, Mudd CS majors do not have higher earnings than CS majors at numerous other highly selective colleges.
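
A toy decomposition with made-up numbers shows the mechanism: if “predicted” earnings ignore major mix, a tech-heavy college shows a large earnings premium even when its graduates earn exactly the national median within each major.

```python
# Entirely hypothetical numbers -- not Mudd's real data.
national_median = {"CS/engineering": 110, "everything else": 55}   # $k, hypothetical

def earnings_given_mix(major_shares, medians):
    """Median earnings implied by a college's major mix at per-major national medians."""
    return sum(share * medians[major] for major, share in major_shares.items())

tech_heavy  = {"CS/engineering": 0.85, "everything else": 0.15}   # tech-heavy mix
typical_mix = {"CS/engineering": 0.15, "everything else": 0.85}   # mix a naive model might assume

actual    = earnings_given_mix(tech_heavy, national_median)    # ~101.8
predicted = earnings_given_mix(typical_mix, national_median)   # ~63.2
print(f"apparent premium: ${actual - predicted:.1f}k with zero within-major advantage")
```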

Graduation rate overperformance shows a less obvious pattern. The top-ranked colleges by this metric are below. Looking up the first college (Central Methodist: CGES), the cohort size is 3 students. 2 of the 3 students transferred out, and the 3rd did not graduate. This is reported as a 67% graduation rate in WM’s table, presumably counting the 2 students who transferred out as graduating. Central Methodist’s website instead says, “CMU-CGES currently has a 0% graduation rate”, not counting the 2 students who transferred as graduating. Either way, 3 students is too small a sample size to be meaningful (a quick simulation after the list below shows just how noisy a 3-student cohort is). I suspect there is a unique explanation for the others as well.

  1. Central Methodist: CGES – Expected = 35%, Actual = 67%
  2. Wheeling University – Expected = 50%, Actual = 81%
  3. Maharishi International University – Expected = 40%, Actual = 65%
  4. Boricua College – Expected = 53%, Actual = 78%
  5. Trine University: Online – Expected = 39%, Actual = 64%
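
Here is the quick simulation I mentioned. With only 3 students and a true graduation probability equal to the 35% expected rate quoted above, a large share of cohorts land on 67% or 100% “actual” graduation rates purely by chance (every number other than the 35% is made up):

```python
# Simulate 3-student cohorts whose true graduation probability equals the 35%
# "expected" rate. Tiny cohorts hit extreme observed rates by chance alone.
import random

random.seed(0)
cohort_size, true_rate, trials = 3, 0.35, 10_000
rates = [sum(random.random() < true_rate for _ in range(cohort_size)) / cohort_size
         for _ in range(trials)]
for r in (0.0, 1/3, 2/3, 1.0):
    share = sum(abs(x - r) < 1e-9 for x in rates) / trials
    print(f"observed grad rate {r:.0%}: about {share:.0%} of simulated cohorts")
```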

2 Likes

Probably not the same student. But that’s what makes this most recent version of the ranking so fascinating. The publishers made sure the mix of metrics results in the same well-rounded, well-funded, well-branded institutions winding up in the thick of it.

Aren’t Colby grads way more likely to go into financial services and Bowdoin and Bates grads more likely to go into higher ed? I don’t think anyone goes to Bowdoin or Bates because they think it is a path to making a lot of money.