CSM: College presidents plan 'U.S. News' rankings boycott

<p>Cadbury, hearty greetings to you - glad you are here, and now that you are, I hope you will stick around for whatever else comes out of this conversation. </p>

<p>Your points that “basically anyone could rate the colleges in terms of reputation, but that’s probably only true for USNWR’s top 30-50, depending on how hooked into higher education you are,” as well as that the 12 signatory presidents - who are now actively engaged in a concerted effort to rally support and raise public awareness - “are not the only ones expressing GENUINE concerns,” are spot on. While the signatory colleges are not ranked at the top, many CC posters and readers are probably aware that of those 12 colleges, only 3 - Wheelock College, Heritage University, and Bethany College - are not on CC’s radar (by which I mean not on the list of colleges that have their own forum). By the same token, Moravian College may not have a very active forum, but it does have one, so these IHEs are indeed recognized and prominent LA colleges, and not only within USNWR’s system. </p>

<p>It has been said before, on this thread and a few others, that many lesser-known colleges believe they have more to gain from alternate types of evaluation, while the “brand name” heavyweights have a vested interest in the status quo. No matter: there is a great deal at stake for everyone in this game. Not for nothing are the USNWR rankings so often compared to a beauty contest - as Bethie put it, way back in post 34, there is a huge market for rankings that cater to those “prestige hounds” who want or feel they need a ranking list so they can claim bragging rights to attending the #1, or a top-five, or a top-ten school. Ironically, at this early juncture, there are many ready to pounce and to judge the success or failure of this reform movement based on the prestige factor of the signatory colleges. Once again, we get back to prestige as the make-or-break factor. I think the whole point is to look beyond all those vague barometers of prestige when deciding exactly what constitutes critical mass and the “tipping point”.</p>

<p>Media coverage of the boycott will also play an important role, precisely because, in and out of academe, we are all stakeholders. Hopefully something better will come out of all of this, and that of course does not preclude changes to USNWR itself, because colleges like U of Chicago, SL, and even Moravian do derive benefits from a good ranking system and do care how they are ranked. Above all, we all want a ranking system that is based on confidence in the data and holds the public trust. The Annapolis meeting and the other conferences scheduled later in the year to discuss alternate forms of college assessment will no doubt challenge USNWR hegemony and open the playing field. I can’t help wondering what role, if any, the Education Sector’s announced new rankings system - based on NSSE and Collegiate Learning Assessment (CLA) parameters - will play in all of this.</p>

<p>Boston Globe: “Wheelock raises cry on college rankings; calls magazine’s list subjective and unfair”:</p>

<p><a href="http://www.boston.com/news/local/articles/2007/05/14/wheelock_raises_cry_on_college_rankings/">http://www.boston.com/news/local/articles/2007/05/14/wheelock_raises_cry_on_college_rankings/</a></p>

<p>The media blitz continues: Bloomberg has the following article “College Rankings Don’t Offer `Alpha’ for Parents” by John F. Wasik.</p>

<p><a href="Bloomberg Politics - Bloomberg">Bloomberg Politics - Bloomberg</a></p>

<p>John Wasik’s list of criteria might very well rate the top private elite colleges with even more lofty ratings than the current U.S. News scheme. His list sure doesn’t favor itty-bitty liberal arts colleges.</p>

<p>Ok, in my opinion the US News list is the worst possible place to start a college search. Williams would be disastrous for my son. The sportsy orientation would make him gag. Amherst sounds far too preppy for him. Midd is too close to home. I could go on. We started the college search by taking a good look at our son. Who is he, how does he learn best, what are his best friends like, where would he find people who would inspire him, what types of programs would help him develop his potential, where would he find the most joy, intellectually and socially? After an 18 month search, he answered those questions FOR HIM by choosing Grinnell. For another kid, it might be Beloit, for another, USC or Harvard or a local CC. This stupid list encourages parents to try to fit their kids into schools that may not work for them. It’s really hard to ignore it. As a parent, I felt I needed to help my son identify schools that would fit him, not get him into the school US News says is the best. I never bought the magazine, but as long as millions of other parents do, this truly weird ranking system will probably prevail.</p>


<p>That sounds a lot like the college search process in my family. I roll my eyes :rolleyes: when I hear people decrying the U.S. News rankings, in large part because I haven’t ever met a parent who doesn’t consider other sources of information besides that magazine when choosing colleges. And DEFINITELY the kids themselves refer to all kinds of “information” sources about colleges that have nothing to do with the rankings. We see that on College Confidential every day. So what’s the big worry about the rankings?</p>

<p>Greetings.</p>

<p>In my earlier posts, I spoke about some ways that colleges game the USNWR system to improve their rankings. One of the things that struck me when reading today’s Boston Globe (see link above) is that these practices may just scratch the surface:</p>

<p>“To stand out in the pack and to enhance prestige, some colleges have hired private consulting firms to persuade their peers to give them higher scores. The consultants bombard the colleges with brochures, other promotional literature, and even potted plants with notes attached, said some of the college presidents involved in the push to stop reputational rankings.”</p>


<p>This list is nonsense.
–debt-equity ratio: My Ss will graduate with zero debt. Not because their colleges are generous with financial aid, but because we, the parents, are bankrolling them to the max. Harvard has one of the lowest debt-equity ratios of all colleges; while it is known to be generous with financial aid, it is also known to attract families that can pay full fare. Criterion #1 is worthless. </p>

<p>–Price-earnings ratio: It does not even take students’ personalities into account. A pure math star has a lesser chance of earning big money than a student who pursues an accounting degree. Criterion #2 is also worthless.</p>


<p>Glad he realizes this!</p>


<p>Yes, indeed. And those who limit themselves to USN&WR rankings and ignore their children’s needs and personalities have only themselves to blame if the fit is poor.</p>

<p>Here’s a link to yet another editorial by one of the presidents who is involved in the boycott. She doesn’t make many new points about the list of questionable practices driven by the rankings, but substantiates Cadbury’s contention that the rankings contribute to rising tuitions. She also notes the ways in which the rankings create a strong disincentive to “admit more low-income students from urban public schools who might lower the retention and completion rates” (even if an institution like hers has that as an institutional value).</p>

<p><a href="Rank this, U.S. News">Rank this, U.S. News</a></p>

<p>Isn’t it ironic to read the various “admissions” by the college presidents who speak against USNews! So, what do we have in the end? The admission that the Peer Assessment has become a tool for dishonest behavior, and an admission of the lack of “real knowledge” about other institutions. Well, call me a cynic, but have we not decried that a long time ago --at least the few of us on CC who dropped the blinders of unwavering support for our alma maters? </p>

<p>Now, are we really expected to believe that a school refuses to rein in the rampant explosion of costs because of a few arcane points on the USNews? Isn’t that a tad too simplistic? Further, are we really supposed to believe that the 1.5% of the entire USNews ranking that relates to the admit ratio has compelled schools to become marketing monsters? Heck, doesn’t fiddling with the alumni donation rate yield MUCH better results, or fiddling with the SAT, as Middlebury discovered in past years? </p>

<p>So, in conclusion, whom should the “vilified” consumers trust? On the one hand, we have schools that do not hesitate to manipulate and lie about their true results and selectivity. On the other hand, we have a commercial entity that attempts to collate and report a set of objective measures and balance them with 25% of subjective data. </p>

<p>I guess that we all have to believe that Consumer Reports --and its subscribers-- are responsible for the dishonesty and shoddy quality of many manufacturers!</p>

<p>marite - am I wrong in thinking that what colleges list for the debt ratio is the Stafford Loan debt? It’s always about what four years of Staffords would be. The schools with higher debt tend not to claim to meet 100% of need, either.</p>

<p>USNWR was a jumping-off point for our family -- not a Bible. We were interested in class-size ratios, and in the selectivity stats as a tool to see whether our child would even be a competitive applicant or whether the school was out of reach. Neither of my college boys picked the “highest ranked” (according to USNWR) of their acceptances.</p>

<p>We are starting the process again with our daughter, who is totally different from her older brothers. I doubt we’ll use many of the same criteria. </p>

<p>It does irk me to no end that she receives a ton of unsolicited college spam from schools like Washington University in St. Louis, where she is clearly not a competitive candidate, on top of their very low acceptance rate. I’m glad there is selectivity data to help us steer clear of schools where the application fee would be a waste of money.</p>

<p>Xiggi - I have to agree with you here. As imperfect as the USNWR ratings are, at least they are a form of objective information compiled by a relatively disinterested third party (and thus more likely to be accurate than any spin put out in a college’s own marketing materials or any totally subjective assessment process). In this sense, to use an SAT analogy, USNWR is to prospective college students as the SAT is to college admission committees. Both can be criticized for being subject to manipulation and for not saying much about the “fit” between a college and a student, but they are both better than nothing. And in both cases, if you aren’t coming out on top in the ratings, the first tendency is to shoot the messenger.</p>

<p>If you are looking for statistical measures that are even less subjective, a few years ago someone did a study of students who had been admitted to two or more top colleges to see which ones they chose. So for example, in the case of a student admitted to both Harvard and Penn, the students might choose Harvard 80% of the time, but in the case of a Penn-Cornell dual admit, the students might choose Penn 60% of the time. IIRC a table of the results appeared in the NY Times. The number of students surveyed was limited and not every single college was covered, but the results represent a sort of “voting with your feet” - the college with the highest average % of wins in dual admits (IIRC this was Harvard) would be #1 in student preference, etc.</p>
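<p>For what it’s worth, the basic arithmetic behind that kind of “voting with your feet” ranking is easy to sketch. The toy Python below tallies head-to-head choices among hypothetical dual admits and ranks schools by their average win rate; the records and percentages are invented for illustration, and the actual study fits a statistical model to its survey data rather than averaging raw win percentages.</p>

```python
from collections import defaultdict

# Hypothetical dual-admit decisions: (school the student chose, school turned down).
# These records are invented for illustration only; they are not data from the study.
decisions = [
    ("Harvard", "Penn"), ("Harvard", "Penn"), ("Harvard", "Penn"), ("Penn", "Harvard"),
    ("Penn", "Cornell"), ("Penn", "Cornell"), ("Cornell", "Penn"),
    ("Harvard", "Cornell"), ("Harvard", "Cornell"),
]

# Tally head-to-head wins and total matchups for each pair of schools.
wins = defaultdict(int)    # wins[(a, b)] = number of times a was chosen over b
totals = defaultdict(int)  # totals[frozenset({a, b})] = number of times the pair occurred
for chosen, declined in decisions:
    wins[(chosen, declined)] += 1
    totals[frozenset((chosen, declined))] += 1

schools = {s for pair in totals for s in pair}

def avg_win_rate(school):
    """Average share of head-to-head 'wins' across every school this one was paired with."""
    rates = []
    for other in schools - {school}:
        pair_count = totals.get(frozenset((school, other)), 0)
        if pair_count:
            rates.append(wins.get((school, other), 0) / pair_count)
    return sum(rates) / len(rates) if rates else 0.0

# Rank schools by average win rate, the "voting with your feet" measure described above.
for school in sorted(schools, key=avg_win_rate, reverse=True):
    print(f"{school}: average head-to-head win rate {avg_win_rate(school):.0%}")
```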

<p>Instead of debating how to improve the rankings, note the words in the thread title: “rankings boycott.” It is the one-size-fits-all nature of the USNWR rankings that is so pernicious. There are quality publications which present school information and data without resorting to rankings.</p>

<p>texastaximom shows how the information USNWR collects is valuable, and it may well be that USNWR collects and presents the most data. Drop the rankings, and we can debate how best to present the information.</p>

<p>I am a subscriber to the USNWR web site, which can be used as in texastaximom’s example, but only in a quite limited way: you can check only “Most selective” and get an alphabetical list of the schools USNWR considers to be in that category, but you can’t see the raw numbers or sort by value, so you can’t see the lowest “Most selective” school next to the highest “More selective” school, which may be a tenth of a percentage point apart. I’d rather see the raw numbers than USNWR’s opinion.</p>

<p>With such a vast array of available data, on-line access may be the only reasonable way to fine-tune our college search activities.</p>

<p>texastaximom:</p>

<p>There’s no mention of Stafford loans in the USN&WR list of best value colleges. The fine print mentions 3 variables on which the list was based:

  1. Ratio of quality to price (overall score in the USN&WR rankings / cost to a student receiving the average grant meeting financial need).
  2. Percentage of undergrads receiving grants meeting financial need.
  3. Average discount: percentage of a school’s total costs covered by the average need-based grant.</p>
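<p>To make variables 1 and 3 concrete, here is a minimal sketch with invented numbers for a hypothetical school. How USN&WR actually weights the three variables to produce its final best-value list is not spelled out in the fine print, so this only illustrates the individual measures.</p>

```python
# Illustrative calculation of the three "best value" variables listed above, using
# invented numbers for a hypothetical school (not actual USN&WR data).
overall_score = 92            # overall score in the USN&WR ranking (assumed)
sticker_price = 45_000        # total cost of attendance (assumed)
avg_need_grant = 27_000       # average grant meeting financial need (assumed)
pct_on_need_grants = 0.48     # share of undergrads receiving need-based grants (assumed)

net_cost = sticker_price - avg_need_grant      # cost to a student receiving the average grant
quality_to_price = overall_score / net_cost    # variable 1: ratio of quality to price
avg_discount = avg_need_grant / sticker_price  # variable 3: average discount from total cost

print(f"Ratio of quality to price: {quality_to_price:.5f}")
print(f"Percent receiving need-based grants: {pct_on_need_grants:.0%}")  # variable 2
print(f"Average discount: {avg_discount:.0%}")
```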

<p>So here are the best value universities and LACs, as per the USN&WR 2003 edition. The list may have changed a bit since then, and it does not include merit-based aid:

  1. Caltech
  2. Princeton
  3. Harvard
  4. Stanford
  5. Rice
  6. MIT
  7. Columbia
  8. Yale
  9. Dartmouth
  10. Penn</p>

<p>LACs:

  1. Amherst
  2. Williams
  3. Swarthmore
  4. Washington & Lee
  5. Grinnell
  6. Colgate
  7. DePauw
  8. Pomona
  9. Wabash
  10. Macalester</p>

<p>Percy, I believe you’re making a reference to this study:</p>

<p>A Revealed Preference Ranking of U.S. Colleges and Universities
By Christopher Avery, Mark Glickman, Caroline Hoxby, Andrew Metrick</p>

<p><a href="http://www.nber.org/papers/W10803">http://www.nber.org/papers/W10803</a></p>

<p>Vossron, have you tried to select a category from WITHIN a table? For instance, you can sort the LAC table by selectivity and compose your own table. The following one shows the 30 most selective LACs, plus a few ranked in the 40-50 range for comparison’s sake. The columns are the selectivity rank, name of the school, SAT range, percentage of students in the top 10% of their class, admit rate, and lastly the overall rank by US News. </p>

<p>1 Harvey Mudd College (CA) 1380-1560 91% 36% 14
1 Pomona College (CA) 1380-1530 88% 19% 7
1 Amherst College (MA) 1350-1560 87% 19% 2
4 Williams College (MA) 1340-1530 88% 19% 1
4 Swarthmore College (PA) 1350-1530 88% 22% 3
6 Haverford College (PA) 1290-1470 91% 26% 9
7 Claremont McKenna College (CA) 1310-1490 84% 21% 12
8 Middlebury College (VT) 1280-1475 84% 24% 5
9 Barnard College (NY) 1290-1450 83% 27% 26
10 Bowdoin College (ME) 1320-1470 78% 25% 7
11 Davidson College (NC) 1280-1440 77% 27% 10
11 Washington and Lee University (VA) 1300-1450 76% 29% 17
11 Wellesley College (MA) 1310-1480 77% 34% 4
14 Wesleyan University (CT) 1300-1490 71% 28% 10
14 Carleton College (MN) 1320-1500 71% 29% 6
16 Vassar College (NY) 1340-1450 67% 29% 12
16 Grinnell College (IA) 1280-1480 73% 45% 14
18 Colgate University (NY) 1280-1430 68% 27% 16
19 Hamilton College (NY) 1270-1440 70% 36% 17
20 Oberlin College (OH) 1270-1460 68% 34% 22
20 Colby College (ME) 1280-1430 67% 38% 20
22 Bucknell University (PA) 1230-1390 68% 34% 29
22 Scripps College (CA) 1270-1450 69% 46% 26
24 Macalester College (MN) 1260-1450 65% 44% 24
25 Bard College (NY) 1240-1440 63% 32% 36
25 Colorado College 1220-1400 66% 38% 26
27 Bates College (ME) 1280-1410 57% 29% 23
28 Gettysburg College (PA) 1210-1350 66% 43% 45
29 Kenyon College (OH) 1240-1420 59% 36% 32
29 Lafayette College (PA) 1180-1370 62% 37% 30
29 Bryn Mawr College (PA) 1210-1400 62% 46% 20
29 Whitman College (WA) 1240-1450 60% 49% 36</p>

<p>41 Smith College (MA) 1150-1380 61% 48% 19
43 Reed College (OR) 1280-1470 57% 45% 53
43 Mount Holyoke College (MA) 1230-1410 51% 52% 24</p>
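<p>For anyone who would rather compose such a table offline, a minimal sketch like the one below will sort a handful of rows copied from the table above. Sorting by admit rate alone is a deliberate simplification for illustration; the selectivity rank in the actual table combines test scores, class rank, and admit rate.</p>

```python
# "Compose your own table": sort a few rows copied from the table above.
# Sorting by admit rate alone is a simplification chosen here for illustration;
# the selectivity rank in the real table is a composite measure.
rows = [
    # (school, SAT 25th-75th, % in top 10% of class, admit rate, overall USNWR rank)
    ("Harvey Mudd College (CA)", "1380-1560", 91, 0.36, 14),
    ("Pomona College (CA)",      "1380-1530", 88, 0.19, 7),
    ("Amherst College (MA)",     "1350-1560", 87, 0.19, 2),
    ("Williams College (MA)",    "1340-1530", 88, 0.19, 1),
    ("Middlebury College (VT)",  "1280-1475", 84, 0.24, 5),
]

for school, sat, top10, admit, overall in sorted(rows, key=lambda r: r[3]):
    print(f"{school:28s} SAT {sat}  top 10%: {top10}%  admit: {admit:.0%}  USNWR rank: {overall}")
```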

<ol>
<li> California Institute of Technology 53% $13,694 67% </li>
<li> Harvard University (MA) 49% $16,346 63% </li>
<li> Princeton University (NJ) 51% $16,917 61% </li>
<li> Yale University (CT) 42% $16,268 63% </li>
<li> Massachusetts Inst. of Technology 60% $18,587 58% </li>
<li> Stanford University (CA) 43% $18,767 58% </li>
<li> Dartmouth College (NH) 50% $18,804 58% </li>
<li> Rice University (TX) 34% $15,561 52% </li>
<li> U. of North Carolina—Chapel Hill * 31% $14,464 47% </li>
<li><p>Duke University (NC) 38% $20,172 54% </p></li>
<li><p>Williams College (MA) 43% $15,621 64% </p></li>
<li><p>Amherst College (MA) 46% $16,177 64% </p></li>
<li><p>Wellesley College (MA) 58% $17,547 59% </p></li>
<li><p>Skidmore College (NY) 41% $11,952 66% </p></li>
<li><p>Pomona College (CA) 53% $17,740 59% </p></li>
<li><p>Swarthmore College (PA) 48% $18,692 57% </p></li>
<li><p>Middlebury College (VT) 44% $19,216 56% </p></li>
<li><p>Bowdoin College (ME) 44% $18,965 57% </p></li>
<li><p>Macalester College (MN) 69% $18,774 51% </p></li>
<li><p>Grinnell College (IA) 54% $18,395 49% </p></li>
</ol>

<p>The numbers are: % receiving grants based on need ('05); average cost after receiving grants based on need ('05); average discount from total cost ('05).</p>

<p>Yes - that was the study. </p>

<p>Results are presented in tabular form here:</p>

<p><a href="http://graphics8.nytimes.com/images/2006/09/17/weekinreview/nwr_LEONHARDTweb-ch.jpg">http://graphics8.nytimes.com/images/2006/09/17/weekinreview/nwr_LEONHARDTweb-ch.jpg</a></p>

<p>The way I read the results, the rankings are as follows:</p>

<ol>
<li><p>Harvard (“wins” more than 50% of the time against all other schools).</p></li>
<li><p>Yale (loses only to H)</p></li>
<li><p>MIT (loses only to H and Y)</p></li>
<li><p>Stanford (3 losses)</p></li>
<li><p>Princeton (4)</p></li>
<li><p>Brown (5)</p></li>
<li><p>Columbia (6)</p></li>
<li><p>Dartmouth (7)</p></li>
<li><p>Penn/Cornell (tie - 8)</p></li>
<li><p>Georgetown (10)</p></li>
<li><p>Duke (11)</p></li>
</ol>

<p>I think the validity of the study may be somewhat crippled by a limited data set - they surveyed 3,200 HS students, and I can’t believe there were enough dual admits in that sample to say with a high degree of statistical confidence whether, for example, the Penn-Dartmouth match (54% chose Dartmouth) would hold in a national sample. But broadly speaking they seem to get the order roughly “right” versus other ranking schemes - there were no totally flukish results like ranking Tufts ahead of Princeton. I would question, though, whether the relative positions of #6 and below are completely correct (to the extent that such a beauty contest means anything to begin with).</p>

<p>Note also that some schools are missing from the table completely - I’m guessing because they didn’t get a statistically significant sample - Chicago, CalTech, Wash U., Hopkins, Rice, Vanderbilt, etc.</p>

<p>Are you confusing the New York Times table with the results of the whole study? </p>

<p><a href="http://post.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf">http://post.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf</a></p>

<p>I wasn’t “confusing” it because I had never seen the whole study before - the results I gave were purely based on the NY Times table. Thanks for posting the link to the full study.</p>

<p>To some extent, the full results could be perceived as being a “popularity contest” rather than a ranking of quality - for example, they put U. Chicago (with the unjust reputation as “the place where fun goes to die”) as #28, behind Brigham Young and Wesleyan, among others. I think it would be hard to find anyone who seriously thinks that Chicago is the lesser institution compared to those.</p>