The limit of this data is that it results from looking in the rear-view mirror. Nothing makes that clearer than the case of New College, which ranks top 10 nationally (#6) but really is not the same college anymore, is it?
It's 9 years from entry into college, meaning from freshman year. College Scorecard uses the same wording. For most people, that would really be *five* years after graduation.
Does anyone have first-hand experience with Berea? What do you like / don't like about it? Share with us your honest review of the school!
This list from Washington Monthly is far superior to the lame-o Forbes list. My kid's college is up at #14, so I am certain that this is the most accurate, best and brightest ranking out there!
In seriousness, Berea is a surprise, but no doubt its position as a well-known work college is why it performs well.
Edit: I am always amazed when a college I have never heard of, in this case Boricua College, appears on lists like these, seemingly out of nowhere. Does anyone have knowledge of this school?
Nope. Haven't googled yet. Maybe in Puerto Rico… as the term is related to Puerto Ricans.
It looks like it's actually in New York, but founded by Puerto Ricans: About Boricua College – Boricua College
Maybe because it also serves adults primarily over the age of 25, so perhaps career changers or people looking for qualifications to earn promotions and higher salaries? It's also relatively inexpensive, with tuition at $11k a year. Presumably most students commute. I didn't look at the methodology, but possibly there is a good amount of financial aid or grants, so there is less debt at graduation.
9 years after entry, not 9 years after graduation. They also may be including dropouts. The wording is ambiguous about whether they pool all students who enter college, including dropouts, or compare predicted vs. actual separately for grads and dropouts. I suspect the former, as I can't imagine they'd have a sufficient sample size of dropout earnings for many of the small LACs that are ranked.
"We measured post-college outcomes using actual versus predicted earnings of students nine years after college entry (using just two cohorts of data). This captures the outcomes of graduates as well as dropouts."
The familiar colleges that ranked highest for predicted vs. actual earnings are below. They are all colleges with a large portion of students in high-earning majors, such as CS/tech, business, and maritime. This list makes me suspect that their predicted-earnings formula doesn't adequately consider major distribution (a toy illustration of the overperformance arithmetic follows the list).
- Harvey Mudd – Predicted = $82k, Actual = $124k
- MIT – Predicted = $89k, Actual = $118k
- Bentley – Predicted = $72k, Actual = $99k
- Stanford – Predicted = $80k, Actual = $107k
- CA Maritime – Predicted = $62k, Actual = $85k
- Caltech – Predicted = $97k, Actual = $119k
- Maine Maritime – Predicted = $60k, Actual = $81k
- SUNY Maritime – Predicted = $63k, Actual = $84k
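
As mentioned above, here is a minimal sketch of the overperformance calculation using the figures from the list. The major-mix comment at the end is my own hypothetical, not anything WM describes:

```python
# Overperformance as implied by the list above: actual minus predicted earnings.
# Figures are from the post; the major-mix note at the end is hypothetical.

colleges = [
    ("Harvey Mudd",    82_000, 124_000),
    ("MIT",            89_000, 118_000),
    ("Bentley",        72_000,  99_000),
    ("Stanford",       80_000, 107_000),
    ("CA Maritime",    62_000,  85_000),
    ("Caltech",        97_000, 119_000),
    ("Maine Maritime", 60_000,  81_000),
    ("SUNY Maritime",  63_000,  84_000),
]

for name, predicted, actual in colleges:
    print(f"{name:15s} overperformance = ${actual - predicted:>7,}")

# If the predicted-earnings model controlled for major mix, predictions for
# CS/engineering/maritime-heavy schools would rise, and their apparent
# "overperformance" would shrink, e.g.:
#     predicted += beta_major * share_of_high_earning_majors
```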
Like almost all college rankings, the overall ranking of best colleges has little meaning. The ranking is based on an arbitrarily selected weighting of criteria, and most of those criteria are unlikely to be meaningful to specific students considering a college.
I suppose the ranking could be used by students to get names of colleges they might not have previously considered, such as Berea, and then they could research those colleges further. Many students are not aware that Berea has $0 tuition for all students, with 81% attending at zero cost (according to its website), which seems to be the largest contributing factor in why Berea does so well in this ranking.
However, what portion of students would prioritize criteria and weightings similar to what WM used in its rankings formula, such that good-fit colleges for that student would correspond well to the rankings list? That student would need equal priorities for access (stats related to Pell Grant students), affordability (net price for students with low incomes and average debt among students who take on debt), outcomes (graduation rate and earnings-related stats, apparently without considering major distribution), and service (portion of students joining and support for ROTC, military service, Peace Corps, AmeriCorps, …).
I agree, and I wish people knew that before they start naming schools in the US News top 20.
I surmise that at most schools, including those deemed high pedigree, major drives earnings more than anything.
But it's fun to debate all this… if people didn't debate, there'd be no use for rankings.
I do! It's in NYC, in the panhandle part of Manhattan (if you know what I mean). I visited it on purpose when I spent a day walking from the top to the bottom of Manhattan, because MacKenzie Scott had just given it a big donation. It's not part of the CUNY system, it's private, but it's very affordable and was set up by Puerto Ricans to educate the neighborhood's residents. Lots of teaching degrees, etc. It is in a really lovely hilly neighborhood, and the building is gorgeous. The other side of the building is a free museum, the Hispanic Society Museum, and it has some great old paintings as well as some modern murals that are really cool.
Any magazine other than USNWR has to come up with a ranking method that produces a list that is a little shocking in order to get people talking and sell magazines. Washington Monthly's strategy seems to be to create lists that go back and forth between famous names and no-names. Thus Metro State (MN) wedged in between Notre Dame and Grinnell.
(Seriously, though, it warms my heart to see Metro State get a little love. They really do a good job.)
WM dramatically changed the rankings formula this year. Ten years ago, big publics dominated the rankings. In recent years, the formula changed to put the top few Ivy+ type private colleges on top. This year's formula changed again and gives equal weight to the four categories below. Typical Ivy+ colleges only do well in 2 of the 4, which allows other colleges that do well in 2+ categories to have a similar or superior overall ranking to familiar Ivy+ colleges. CSUs seem to do especially well overall with the new formula. Lists of the colleges that do best in the four categories are below (a toy sketch of the equal-weight arithmetic follows the lists):
Access (good Pell-related stats)
1. Southern New Hampshire (57k Pell grants among its >100k online students, more than any other college in the ranking)
2. CSU: NR
3. Liberty
4. UTX: Rio Grande
5. CSU: LA
…
785. Harvard
Affordability (low-income students have a low net price; low average loan amount among students who take loans)
1. Ozarks (uses work study instead of tuition)
2. Berea
3. Stanford (net price for low-income students is negative)
4. MIT (net price for low-income students is negative)
5. CUNY: John Jay
Outcomes (good grad rate and earnings stats)
1. Harvey Mudd
2. MIT
3. Princeton
4. Caltech
5. Stanford
Service (support/participation in ROTC/military, Peace Corps, …)
1. Hobart & William Smith (#3 in Peace Corps/AmeriCorps)
2. Macalester (#2 in Peace Corps/AmeriCorps)
3. U Montana
4. U San Diego (private Catholic college, not UCSD)
5. U MO-Columbia
…
1061. Northwestern (ranked >900 in both Peace Corps and ROTC)
Overall
1. Berea (#2 in affordability; worst category is outcomes)
2. CSU: Fresno (#17 in affordability; worst category is outcomes)
3. CSU: Sacramento (#18 in affordability; worst category is service)
4. CSU: Chico (#36 in service; worst category is outcomes)
5. Penn (#28 in outcomes; worst category is access)
…
51. Harvard (good in affordability and outcomes, bad in access and service)
…
549. Harvey Mudd (#1 in outcomes, bad in everything else)
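
As flagged above, here is a toy sketch of what equal category weights mean mechanically. The scores below are invented standardized values, not WM's data; it just shows why a college strong in two-plus categories can outrank one that dominates a single category:

```python
# Toy equal-weight composite, 25% per category. Not WM's actual formula;
# the category scores below are invented standardized values for illustration.

def composite(access: float, affordability: float,
              outcomes: float, service: float) -> float:
    return 0.25 * (access + affordability + outcomes + service)

# Strong in two categories, middling elsewhere (think CSU: Fresno)...
broad = composite(access=1.5, affordability=1.5, outcomes=0.0, service=0.0)
# ...beats dominant in one category but weak in the rest (think Harvey Mudd).
peaked = composite(access=-1.0, affordability=0.5, outcomes=3.0, service=-1.0)

print(f"broad = {broad:.3f}, peaked = {peaked:.3f}")  # broad = 0.750, peaked = 0.375
```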
Personally, I'd want a good outcome, and I believe most would agree (although not on this website). Of course, one can't purchase something they cannot afford.
It's interesting that none of these categories which are being used to define the "best colleges" has anything to do with the quality of the teaching and learning experience at college.
Maybe it's assumed all are equal, but it's a ranking based somewhat on affordability (or perhaps diversity, via affordability), etc.
Not everyone has an access issue. Many do, but not all.
Everyone wants a "good outcome" for their kids, but people differ on what they consider a "good outcome." My feeling is that a good outcome doesn't have to mean a high-paying job immediately after graduation (or within the first few years after graduation), even though this is the easiest type of "outcome" to quantify. Some people's careers take longer to take off. Or they might spend these years building their own business(es), with a low income but potential long-term payoff. Or they may ultimately have higher satisfaction in life with a career that isn't as high paying as some.
If you can afford to pay your bills (and successfully retire), there's nothing better than this… no matter your income.
How do you measure a good outcome in a college ranking formula? The WM rankings are unique in that they don't consider graduation rate and earnings in an absolute sense, but instead only consider how they compare to expected values. The colleges that overshoot expectations are ranked well; the colleges that undershoot expectations are ranked poorly. The "outcome" rankings are trying to measure the effect the college has on outcomes, rather than the effect of being selective enough to admit top students.
This sounds good in theory. The problem is that it is difficult to accurately measure "predicted." As noted earlier, "predicted" earnings doesn't seem to consider major distribution, so the colleges that overperform on earnings are often ones that overperform in the portion of students choosing higher-paying majors/fields, rather than having the highest earnings among students within a given major/field. For example, the majority of #1-ranked Mudd's students major in CS or engineering (including dual majors). However, Mudd CS majors do not have higher earnings than CS majors at numerous other highly selective colleges.
Graduation-rate overperformance shows a less obvious pattern. The top-ranked colleges by this metric are below (a sketch of a small-cohort sanity check follows the list). Looking up the first college (Central Methodist: CGES), the cohort size is 3 students. Two of the 3 students transferred out, and the third did not graduate. This is reported as a 67% graduation rate in WM's table, presumably counting the 2 students who transferred out as graduating. CMU's website instead says, "CMU-CGES currently has a 0% graduation rate," not counting the 2 students who transferred as graduating. Either way, 3 students is too small a sample size to be meaningful. I suspect there is a unique explanation for the others as well.
- Central Methodist: CGES – Expected = 35%, Actual = 67%
- Wheeling University – Expected = 50%, Actual = 81%
- Maharishi International University – Expected = 40%, Actual = 65%
- Boricua College – Expected = 53%, Actual = 78%
- Trine University: Online – Expected = 39%, Actual = 64%
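
Here is a quick sketch of the kind of sample-size check WM's table apparently lacks. The 30-student threshold is my own arbitrary choice, and the only cohort size I know is CMU-CGES's 3; the other size is an invented placeholder:

```python
# Flag graduation-rate "overperformance" that rests on tiny cohorts.
# MIN_COHORT is an arbitrary threshold of mine, not Washington Monthly's.
MIN_COHORT = 30

schools = [
    # (name, expected rate, actual rate, cohort size); rates are from the list
    # above; only CMU-CGES's cohort of 3 is known -- the other size is invented.
    ("Central Methodist: CGES", 0.35, 0.67,   3),
    ("Wheeling University",     0.50, 0.81, 150),
]

for name, expected, actual, n in schools:
    flag = "" if n >= MIN_COHORT else f"  <-- cohort of {n}: not meaningful"
    print(f"{name}: +{(actual - expected) * 100:.0f} pts{flag}")
```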
Probably not the same student. But that's what makes this most recent version of the poll so fascinating. The publishers made sure that the mix of metrics results in the same well-rounded, well-funded, well-branded institutions winding up in the thick of it.
Aren't Colby grads way more likely to go into financial services and Bowdoin and Bates grads more likely to go into higher ed? I don't think anyone goes to Bowdoin or Bates because they think it is a path to making a lot of money.