
Dickinson Announces New Rankings Policy

245 Replies to: Dickinson Announces New Rankings Policy

  • fastMEd - Posts: 1,073 Member
    I think only lesser colleges complain about this, since they are shown to be lesser colleges by a certain ranking system.

    True, there are occasions when not enough people respond to the surveys and that may lower the rank (like UChicago), but that is in the minority.

    On the whole, the ranking system is reasonable. What do SAT scores and high school GPAs mean? Nothing. US News has the right criteria: number of top-quality professors, retention rates, perceived prestige among fellow institutions, etc. Those are the things relevant to college. SATs and high school GPAs are useful for admissions, not for judging college quality.

    By top-quality professors, I generally mean Nobel laureates or Fields Medalists.
    Perceived prestige is also important, since that matters for jobs. No matter what they say, this subjective ranking is useful. People do not think Berkeley is much better than Arizona State for no reason.
  • Citan Registered User Posts: 2,287 Senior Member
    good for dickinson
  • RMassa Registered User Posts: 1 New Member
    As the vice president of Dickinson College and the parent of a college junior and a recent graduate, I would like to weigh in on this issue first to clarify Dickinson’s position and then to comment in general on what the college admission process has become.

    First, there are about 650 undergraduate liberal arts colleges in the US. That Dickinson is “generally” considered to be among the top 10% of those colleges (whether ranked by US News or not) hardly makes the college a “sore loser” nor does it make it irrelevant. To the latter point, it is one of only 5 colleges founded before 1800 that is still a small college today, and was one of the original colleges that defined American higher education for the future during the revolutionary era. (I hesitate to add this, but to respond to the “sore loser” assertion, Dickinson actually moved up 4 places in USNews from last year and is ranked by Princeton Review as #11 in the college library category, whatever that means).

    The rising prominence of USNews and other rankings is symptomatic of a larger issue – our desire as a society to distill complex information quickly without having to spend a lot of time doing the work ourselves. The order in which colleges are ranked is, to the public, a surrogate for their quality – and that is inaccurate and quite simplistic. For example, Dickinson’s endowment is the 42nd largest among all liberal arts colleges. It is no coincidence that USNews ranks the college at #41. The USNews rankings correlate most significantly with institutional wealth, not with HOW an institution spends its resources (a more accurate indication of quality).

    By the way, Dickinson does not require standardized tests for admission, though 90% of our applicants do submit scores and we do use the test as one criterion in the admission process. We have taken this stance for the past ten years precisely for the same reasons we now will not promote USNews or other rankings on our website or in admissions literature: the SAT can be used by admission offices as a “shortcut” to the measurement of student quality and fit, just as institutional wealth is used as a shortcut in the rankings. We wanted to send a clear message to the members of our community that Dickinson will not do this.

    I truly believe that when colleges and others bemoan the USNews rankings, it is not so much the “hatred of the rankings” that motivates, but a general frustration with a college admission process that has become commercialized and stress-producing for students and for colleges. We have, particularly in the East, lost sight of the notion of institutional fit – finding the right college for the right student. It has become all about rankings and prestige and window stickers. Colleges share equal responsibility with students and families for this state of affairs.

    One final point as illustration. In my last of ten years as the dean of enrollment at Johns Hopkins, a parent cornered me at an open house to ask about the political science department. When I told her that the chair of the department would be available later in the day, she announced that they couldn’t stay because they were driving to North Carolina and that “they” were only going to apply to “top ten” universities anyway. Hopkins was ranked number 15 in USNews that year (1998). Of course, the joke was on them: when the rankings came out in 1999, Hopkins had “risen” to number 7. Point – it’s not the rankings, but how we use them to guide our decisions that has become the problem. Dickinson is simply saying – we will not contribute to their misuse by congratulating ourselves publicly for being among the top colleges in the country.

    I hope this helps to clarify our position (which was meant only to be shared within our community, but which has obviously found its way to the public).

    Robert J. Massa
    Vice President
    Enrollment & College Relations
    Dickinson College
    [email protected]
  • hoedown Registered User Posts: 3,751 Senior Member
    Thank you for taking the time to share Dickinson's perspective--that's really valuable.

    I would also challenge those who dismiss non-participants as "sore losers" with the fact that Dickinson is not alone. Reed stands out on this list. One would be hard-pressed to convince me that Reed is a "sore loser."
  • Tarhunt Registered User Posts: 2,138 Senior Member
    Corbett:

    US News can't give us those numbers because the samplings aren't random and there are not and can never be benchmarks. I'm quite sure that no factor analysis has been done or can even be done in the absence of benchmarks.

    What I was trying to say is that I think I'm perfectly capable of interpreting the numbers, and US News publishes the weightings and most of the numbers on which its rankings depend. In that context, the US News rankings are quite useful.

    As for tiers, I don't understand why ranking one school with, say, a score of 76 as "first tier" and another with a score of 75 as "second tier" could possibly be more useful than publishing the scores and the data points in rank order.
  • xiggi Registered User Posts: 25,441 Senior Member
    I would also challenge those who dismiss non-participants as "sore losers" with the fact that Dickinson is not alone. Reed stands out on this list. One would be hard-pressed to convince me that Reed is a "sore loser."

    Sore loser is a poor expression. However, Reed does not miss any chance to milk and exploit its position regarding the USNews rankings. I believe that USNews is making a HUGE mistake in listing Reed and other iconoclasts in the rankings: the school, as well as the remaining schools that have non-standard policies that allow them to manipulate the data a la Middlebury at will, should be delisted, or simply lumped in a special unranked category.

    In the case of Dickinson, I applaud the school's decision to cease discussing the issue--a welcome departure from Reed's constant whining. I hope this will free up time and energy to publish meaningful data such as the Common Data Set, and let potential candidates evaluate the school in comparison to Dickinson's peers.

    Schools that do not want to play ball should not be given a uniform. Schools that are making a mockery of the system should be asked to return theirs.
  • xiggi Registered User Posts: 25,441 Senior Member
    First, there are about 650 undergraduate liberal arts colleges in the US. That Dickinson is “generally” considered to be among the top 10% of those colleges (whether ranked by US News or not) hardly makes the college a “sore loser” nor does it make it irrelevant.

    With all due respect to Mr. Massa, I find the quoted statement worthy of a challenge.

    I have little doubt that Mr. Massa knows that USNews has used the classifications of the Carnegie Foundation to establish its categories. I believe that the number for the rankings was 215 or 217. Despite the recent reclassification by the Carnegie Foundation, I am not certain why Dickinson might opt to compare itself to 650 schools ... except to reinforce the general consideration of being among the top 10%. For that matter, why not decree that Dickinson is among the top 1 or 2% of all colleges in the country? In the meantime, I'd be most curious to read more about the Liberal Arts Colleges that could be listed/ranked from 218 to 650, and to ascertain whether they are indeed national, regional, or local colleges.


    On the issue of classifications, here's how things are expected to change in the future at USNews:

    Baccalaureate Colleges. Although our criteria for subcategories are unchanged from the 2000 edition, we have discontinued the use of the “Liberal Arts” terminology in favor of a term that more transparently describes the classification criteria. (Both “liberal arts college” and “liberal arts education” signify more than the proportion of undergraduates who major in traditional arts and sciences fields.) Note that the new Undergraduate Instructional Program classification offers finer differentiation of the distribution of undergraduate majors, while also identifying institutions where arts and sciences and professional fields are represented among majors in roughly equal proportions.

    Because we increased the threshold level of master’s degree production separating Baccalaureate and Master’s institutions, some institutions that previously would have been classified among Master’s Colleges and Universities II are now included among Baccalaureate Colleges. Exclusively undergraduate institutions can be identified using the Enrollment Profile classification, and the Undergraduate Instructional Program classification can be used to determine the degree of correspondence between undergraduate and graduate programs.

    "How will U.S. News and World Report use the new classifications in its annual college rankings?

    Because the organizing framework of the U.S. News comparison groups is based on a conception of regional and national markets, and the Carnegie Classification does not use regional or national draw as a classification criterion, the Carnegie Foundation is hopeful that U.S. News will develop an approach to defining comparison groups that is better suited to its analytic framework.

    Robert Morse, Director of Data Research at U.S. News, has provided this statement:
    In late February 2006, Carnegie released the final "basic" classification of higher education institutions. There is not enough time for U.S. News & World Report to incorporate any of the finalized "basic" Carnegie Classification framework into the America's Best Colleges rankings that will be published in August 2006, called the 2007 edition of America's Best Colleges.

    U.S. News will continue to use its pre-existing categories and schools for the upcoming 2007 edition of the America's Best Colleges rankings to be published in August 2006.

    U.S. News will now have enough time to thoughtfully consider how to best use and incorporate the changes to the "basic" Carnegie Classifications for the following year's America's Best Colleges rankings, to be published in 2007 called the 2008 edition of America's Best Colleges."
  • arcadia Registered User Posts: 2,538 Senior Member
    Xiggi,
    Every school (including Claremont McKenna College) has the ability to manipulate data. At least certain colleges (including Middlebury) make their Common Data Set available to the public. The same cannot be said for CMC.
  • Corbett Registered User Posts: 3,438 Senior Member
    As for tiers, I don't understand why ranking one school with, say, a score of 76 as "first tier" and another with a score of 75 as "second tier" could possibly be more useful than publishing the scores and the data points in rank order.
    Using "tiers" doesn't mean that you can't also have data points. In fact, this is exactly what US News does for Tiers 3 and 4 in their "National University" and "National Liberal Arts Colleges" rankings. They include all of the data points for the schools in these tiers, even though they are all "tied" for ranking purposes, and so are listed alphabetically. Obviously any rating system -- whether schools are ranked individually or in tiers -- should include the supporting data.

    US News implicitly acknowledges that many schools have rankings that are statistically indistinguishable: this is why all Tier 3 and Tier 4 schools are tied, and why many Tier 1 and Tier 2 schools are also tied (e.g. Caltech, Stanford and MIT are all "#4" this year).

    My feeling is that the US News ratings would be more realistic, and would better serve both the schools and the public, if the criteria for such equivalency were broader, so that there were a lot more tied schools in Tiers 1 and 2. If you look at the total scores, you can easily spot gaps that separate clusters of schools, and where you could draw lines between "subtiers" in Tiers 1 and 2.

    This approach would also create much more stability in the rankings; schools would be much less likely to fluctuate up or down in the rankings every year. This would arguably reflect reality more accurately than the current system.
  • xiggi Registered User Posts: 25,441 Senior Member
    Arcadia, haven't I responded to that in a previous thread? I still find the lack of public posting of the CDS by Claremont McKenna unacceptable.

    However, the issue is not that the CDS is not made public; the issue is that the SAT optional policies and creative reporting by Middlebury have been misleading for years and STILL are--despite your claims that they changed in response to prior "complaints." Based on the reporting of a ridiculous number (745 I believe) as 75th percentile, I might concede that incompetence might play a bigger role than dishonesty, but it remains that the CDS forms and scores reported by Middlebury are simply inconsistent with the truth. As far as the other numbers, only the future will tell the degree of "model manipulation" that took place.

    Lastly, you also seem to miss the bigger issue: SAT optional schools such as Midd, Bates, Mt Holyoke, and others, should NOT be listed by USNews in the same category and compared to schools that do not have such policies.
  • arcadia Registered User Posts: 2,538 Senior Member
    Xiggi,
    I do recall you saying that you dislike the fact that CMC doesn't make its data publicly available. However, I don't recall you responding to WesDad's comment (reposted below):

    Okay, more US News anomaly/sloppiness (xiggi should like this one):

    2007 US News (class of 2009) lists CMC middle 50% SAT as 1310-1490.

    But CB and PR have it as 1270-1480.

    And, if you check, you'll find 1310-1490 were the figures in the 2006 US News edition, i.e., the Class of 2008.

    Perhaps someone forgot to update?
  • interesteddad Registered User Posts: 24,177 Senior Member
    My feeling is that the US News ratings would be more realistic, and would better serve both the schools and the public, if the criteria for such equivalency were broader, so that there were a lot more tied schools in Tiers 1 and 2. If you look at the total scores, you can easily spot gaps that separate clusters of schools, and where you could draw lines between "subtiers" in Tiers 1 and 2.

    I agree with you, Corbett. I would take it a step further and say that the really useful part of the USNEWS product is the sortable on-line database. This allows you to sort colleges by one of several different datapoints. For example, you can sort by "peer assessment" score. Interesting. Then, you can sort by admissions selectivity. Interesting. Or by median SATs. Interesting.

    A truly wonderful product would be to extend the list of sortable data points. For example, sort by per-student endowment. Or by percentage of minority enrollment. Or by percentage qualifying for financial aid. Or by percentage of varsity athletes. Or fraternity members. Or by percentage of students getting PhDs. Or by distance to the nearest Ethiopian restaurant.

    Then, potential applicants could approach things in a rational, individual fit manner. What kinds of data points are important to that particular student? Sort the schools by those data points and see which ones rank highly on those.

    Where these ranking systems all fall apart is in trying to establish some kind of overall numeric rank. This false precision, based on one arbitrary set of parameters, is counterproductive to an informed college search.
  • corunnerdad Registered User Posts: 43 Junior Member
    First of all, hooray for Dickinson. Almost makes me want to send my daughter there.

    As an economist who works with data, rankings, and a wide variety of statistical methods for a living, I am always amazed at the shallowness of many of the rankings that are put forward in the general media and used as the basis for articles. In the case of the US News rankings, they use a reasonable method, but I think the motivation behind them is definitely an economic one. At one point, far less of this information was generally available - especially compared with how readily accessible it is now - and US News found a very clever market niche and has done an excellent job promoting its publications. US News is clearly not Time or Newsweek, so this was a clever way to expand, and this section of their business has become a 'Cash Cow' of sorts.

    My intuition about the liberal arts colleges on the list is that between around #6 and #79 this year there isn't really a whole lot of difference - every one of these schools is very good and provides a fine education. Whether or not it meets the needs of an individual student is an entirely different question.

    I think the broader classifications of Most Selective, More Selective, etc. are a lot more valuable as they don't tend to lend spurious accuracy to what is an inherently inaccurate process. Evaluating what school is best for a particular student is a very subjective process. In the end, there's only room for one choice, but there could be a number of places that would be 'bests'.

    When I think of schools like St. Olaf or Illinois Wesleyan, for example, someone very active in the arts would be much better off there than at MIT - so the numeric comparisons basically become irrelevant at that point.

    Looking at our recent trips to different colleges and my own senior's interests might be enlightening. My daughter is a decent distance runner and would like to go to a school where that sport is reasonably popular and taken with some degree of seriousness. She also is a serious student but feels that she would be best off at a school that is not too large. Colorado College is relatively near us and appears to be a reasonable fit based on those criteria. A visit to the school left us wondering if the admission counselor had a long history of drug usage, if all of the students were as overly dramatic as the tour guide, and if the school was generally as disorganized as their admissions office seemed to be. Those are things that are difficult to quantify - but there's CC, well up on the list.

    Williams and Willamette (W obsession?) are also on my daughter's list. Well, there's Williams #1 in everything (#2, though, in Division III XC) and there's poor Willamette down the list a ways (a Top 20 DIII XC school). Despite the fact that a friend of mine went to Willamette undergrad and went on to get his PhD at Yale, there is this lurking monster in the back of my mind asking if I would be destroying my daughter's life if I encouraged her toward poor old Willamette as opposed to Williams. Objectively, I think not, but seeing these types of rankings causes me, at some visceral level, to wonder what type of horrible father I must be to do such a thing!

    A system that had Williams and Middlebury in a top tier with Willamette in a second tier would do a lot to defuse this type of overreaction, which I have to imagine is even worse amongst those who don't possess a particularly quantitative bent for interpreting statistics in the first place.

    So our family will go on with the process, and I'll hope my daughter never bothers to look at the US News rankings. They aren't necessarily bad, and they are well thought through, but the bottom line is that if you left your decision to the rankings, you could be extremely misled about what type of educational situation would be the best fit for an individual.
  • vonlost Super Moderator Posts: 30,798 Super Moderator
    "Reed does not miss any chance to milk and exploit its position regarding the USNews rankings."

    Can anyone point me to where Reed has milked and exploited? xiggi cannot or will not (having been asked).

    "Reed ... non-standard policies that allow them to manipulate the data"

    Meaning that Reed's low ranking is due to their manipulation of the data?

    "Schools that do not want play ball should not be given an uniform."

    Reed agrees, and has asked USNWR to omit them from the rankings, to no avail.
  • interesteddad Registered User Posts: 24,177 Senior Member
    For a really simple "ranking" of colleges, I would suggest that per student endowment would probably tell you more than any other single indicator (real or arbitrarily derived). With few exceptions, it mirrors the results of all the USNEWS hocus pocus, except that it is much more intuitive to recognize that the differences between a school with $205k per student endowment and one with $210k are insignificant.

    Here's a partial list put together by the Questbridge folk:

    http://www.questbridge.org/resources/applying/endowment1.html

    Just pull the LACs from this list. The nice thing about using a hard-data index like this is that it removes the regional bias from the equation -- a bias that penalizes non-New England schools like Grinnell and Pomona in the USNEWS lists. (A quick sketch of what sorting on a single data point like this might look like follows below.)
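    A minimal sketch of that "single hard data point" sort, written in Python only to illustrate the idea. The college names, endowments, and enrollment figures below are hypothetical placeholders (not taken from the Questbridge list), and this is not anyone's actual methodology:

        # Hypothetical figures for illustration only -- not real endowment data.
        colleges = [
            {"name": "College A", "endowment": 1_800_000_000, "students": 2_000},
            {"name": "College B", "endowment": 410_000_000, "students": 2_000},
            {"name": "College C", "endowment": 420_000_000, "students": 2_050},
        ]

        # Compute per-student endowment, then sort on that single data point.
        for c in colleges:
            c["per_student"] = c["endowment"] / c["students"]

        ranked = sorted(colleges, key=lambda c: c["per_student"], reverse=True)
        for c in ranked:
            print(f"{c['name']}: ${c['per_student']:,.0f} per student")

    Colleges B and C come out within a few thousand dollars of each other here, which makes the point about insignificant differences (the $205k vs. $210k example above) easy to see at a glance.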
This discussion has been closed.