<p>The Ivy League schools are not the eight best schools in the United States. They are probably the eight most prestigious, but not the eight best, as rankings across various fields of study show. Other schools outrank the Ivies in many of those fields.</p>
<p>Well, my point was more that it would have been helpful to describe the purported purpose of the rankings and what the parameters indicate, rather than just throwing up “weighted and unweighted” in a difficult-to-read format. This paragraph is getting a lot closer.</p>
<p>I do know that there was recently a company, quoted extensively here on CC and I think in the Chronicle of Higher Education, that was producing its own graduate school rankings – through its own proprietary methodology – and selling consulting services to universities and colleges that wanted to improve their standing in those rankings. The primary focus was research productivity/quality. I am wondering if this is that.</p>
<p>UCSF is arguably the best biomedical research school on the West Coast. As a research med school it perennially outranks its rivals UWashington, Stanford, UCLA, UCSD, etc., and is by far the largest recipient of NIH funds in California. It’s just that its mission is much narrower than that of the universities it’s being compared to. What would happen if Caltech had a med school? What if UCSF were called UC Berkeley’s med school, as it is sometimes thought to be, kind of? Blah blah blah. You get it.</p>
<p>UCSF is arguably the best biomedical research school in the world, not just on the West Coast.</p>
<p>^^Yes, and UW-Madison and UC-Berkeley are two of the universities that, in terms of sheer quantity of academic impact, put almost every other university in the world, let alone every Ivy, to shame.</p>
<p>Of course, academic impact doesn’t = great undergrad.</p>
<p>Or in UCSF’s case, no undergrad.</p>
<p>No, but it means they have smart, productive faculty doing good work, and that is a large part of what being a professor is supposed to be about: adding to the knowledge base. That’s why you have to do original research to get a PhD.</p>
<p>So is this ranking a gauge of how good the research from each university is?</p>
<p>^^Of course, thethoughtprocess. Although Berkeley and Wisconsin would almost certainly rank among the top universities in the world in <em>overall</em> research impact (which is a mix of quantity and quality, as shown in my adjusted rankings here and elsewhere), they wouldn’t necessarily be at the top in research <em>quality</em> for any particular department. The original COHE ranking has Harvard, MIT, Caltech, Yale, and UCSF at the top for research quality, as you can see above, even though those schools are all smaller than, and produce a smaller quantity of research than, Berkeley or UW-Madison.</p>
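<p>To make the quantity/quality distinction concrete, here is a minimal sketch in Python. All scores and faculty counts are made up for illustration; this is not the COHE methodology or my adjusted one, just the general idea that overall impact scales per-faculty quality by size:</p>
<pre>
# Hypothetical illustration: per-faculty "quality" vs. total "impact".
# All numbers below are invented for demonstration only.
schools = {
    # name: (quality score per faculty member, number of faculty)
    "Small Elite U": (9.5, 300),
    "Big Flagship U": (7.0, 2200),
}

for name, (quality, n_faculty) in schools.items():
    impact = quality * n_faculty  # overall impact = quality x quantity
    print(f"{name}: quality/faculty = {quality}, overall impact = {impact:,.0f}")

# The small school wins on per-faculty quality; the large one wins on
# overall impact. The two rankings answer different questions.
</pre>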
<p>Additionally, based on talking extensively with students and professors at both of those schools, I would also not place them among my list of the top 40 undergraduate programs in the country. Neither would many statistical measures that have to do with the quality of the entering class, admission rates to top graduate schools, etc. For that realm, I’d go more with something like this, even though it too is limited: <a href=“http://www.wsjclassroomedition.com/pdfs/wsj_college_092503.pdf”>http://www.wsjclassroomedition.com/pdfs/wsj_college_092503.pdf</a> (as you can see, Berkeley’s undergrad actually comes in at #41)!</p>
<p>Another measure strongly related to undergraduate quality is the amount of research or teaching per student. That’s expressed in measures such as the student-to-faculty ratio, especially if calculated for the most popular departments rather than in “gross” as USNWR does, and the amount of research expenditure per student; for example: <a href=“http://talk.collegeconfidential.com/showpost.php?p=4160917&postcount=4”>http://talk.collegeconfidential.com/showpost.php?p=4160917&postcount=4</a>, and many others. All of that leads to the link above, in which I explained how you might go about choosing a particular undergrad program.</p>
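<p>For concreteness, those per-student measures are simple ratios. A quick sketch with made-up figures (real comparisons would pull enrollment and expenditure data from sources like IPEDS or the NSF surveys):</p>
<pre>
# Hypothetical numbers for illustration only.
students = 25_000          # total enrollment
faculty = 1_500            # instructional faculty
research_spend = 600e6     # annual research expenditure, in dollars

student_faculty_ratio = students / faculty
research_per_student = research_spend / students

print(f"student:faculty ratio = {student_faculty_ratio:.1f}:1")
print(f"research expenditure per student = ${research_per_student:,.0f}")
</pre>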
<p>Thanks – sorry that I missed the date earlier.</p>
<p>This is as I thought–it’s the Faculty Scholarly Productivity Index, which was developed by Academic Analytics. It is not a Chronicle project or a Chronicle product, nor is it endorsed by the Chronicle. It’s a product that Academic Analytics sells to colleges, which is to say that for a fee they will provide an institution with the reports for all of its departments.</p>
<p>The Chronicle reported on it because Academic Analytics gave the Chronicle wide access to its reports, and because this is right up the Chronicle’s alley. But it is wrong to say that this is “the Chronicle ranking.” The Chronicle also reports on the USNews rankings, the NRC rankings, the Philosophical Gourmet rankings, etc. (when they have newsworthy aspects to them). These rankings were newsworthy because they were new, and because Academic Analytics had just done a big marketing push in which it sent sample reports to a number of college presidents.</p>
<p>I know this is a tangent but I think it’s important to clear this up. These are not “Chronicle rankings” but rather outside rankings that the Chronicle reported on.</p>
<p>^Good clarification to keep in mind, hoedown, thanks. However, unlike the USNews, NRC or Philosophical Gourmet rankings, or any others for that matter, these rankings are posted by the Chronicle as an actual “data” source, right alongside all the other data that the Chronicle itself collects on endowments, library sizes, enrollment, graduation rates, etc. That’s why a lot of people refer to them as the “Chronicle” rankings.</p>
<p>The Chronicle website presents it as Data, not as a news story…is that an endorsement?</p>
<p>PosterX, for many schools, there is a strong correlation between the Chronicle site ranking and the WSJ feeder ranking, but for others there is quite a disparity. I wonder what that means.</p>
<p>You mean they include them in the Almanac? I foresee a huge stink being raised about that, if it hasn’t already been. I read the daily digests so I don’t follow the letters.</p>
<p>They are listed under Chronicle of Higher Education’s “Facts & Figures” and “Data,” along with about 25 other Chronicle surveys covering everything from graduate student stipends to National Merit Scholars. They are not in the 2006-2007 Almanac of Higher Education, which is a bit more limited in scope.</p>
<p>Hoedown, the data is listed under “facts and figures” so I sort of see that as an endorsement.</p>
<p>The ranking measures how active the professors are in terms of publications, grant money, and awards and honors. I really think this is a valuable resource for identifying the most active scholars (not necessarily the best); a rough sketch of how such an index might work follows below.</p>
<p>However, as I usually end up saying, I don’t think professor accolades and activity of this sort are necessarily a good indicator of how valuable those professors are to undergraduate students.</p>
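<p>Here is a minimal sketch of how a productivity index like this might combine those inputs. Academic Analytics’ actual formula and weights are proprietary, so everything below (the measures, the weights, the z-score normalization) is an assumption for illustration only:</p>
<pre>
# Hypothetical sketch of a per-faculty scholarly productivity index.
# The real FSPI weighting/normalization is proprietary; the numbers,
# weights, and z-scoring here are assumptions for illustration.
from statistics import mean, stdev

departments = {
    # name: per-faculty (publications, grant dollars in $1000s, awards)
    "Dept A": (4.2, 310.0, 0.20),
    "Dept B": (2.8, 120.0, 0.05),
    "Dept C": (3.5, 450.0, 0.10),
}

weights = (0.50, 0.35, 0.15)  # publications, grants, awards (made up)

# z-score each measure so counts and dollars are on a comparable scale
columns = list(zip(*departments.values()))
zscores = [[(x - mean(col)) / stdev(col) for x in col] for col in columns]

for i, name in enumerate(departments):
    index = sum(w * z[i] for w, z in zip(weights, zscores))
    print(f"{name}: productivity index = {index:+.2f}")
</pre>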
<p>By golly, you’re right! I couldn’t be more surprised!</p>
<p>I understand the ranking quite well, but I think it has some serious flaws, e.g. the method of counting awards, and the weighting given to different forms of publication, which is not appropriate for some fields. Some institutions have declined to use this “ranking” as a measure of their faculty’s productivity for these reasons.</p>
<p>Plus, given that it’s a commercial product, I can’t believe the Chronicle publishes it in this way. I mean, I can believe it (you pointed my eyes right to it), but I think it is inappropriate.</p>