US News ranks 1997 and 2006 versus Gourman Report ranks 1967, 1987, and 1997

<p>I found a copy of the first Gourman Report, from 1967. It includes LACs and rates colleges on an 800-point scale.</p>

<p>I wanted to see how the Gourman Report ratings have changed over the years. The numerical ratings often changed a little from 1967 to 1997 but usually not enough to change the rank significantly. There were some exceptions.</p>

<p>I also wanted to see how the Gourman rank compared to US News rank.</p>

<p>I haven’t had a chance to look this over carefully. Let me know if you see anything interesting.</p>

<p>school, US News 1997, US News 2006, Gourman rank 1967, Gourman rank 1987, Gourman rank 1997
universities<br>
Harvard 3 1 2 2 2
Princeton 2 2 1 1 1
Yale 1 3 3 4 5
U Penn 13 4 11 14 17
Duke 4 5 22 17 19
Stanford 6 6 9 5 6
Caltech 9 7 12 12 14
MIT 5 8 15 11 13
Columbia 11 9 5 15 15
Dartmouth 7 10 10 18 22
Washington U 17 11 37 38 29
Northwestern 9 12 19 13 16
Cornell 14 13 7 8 7
Johns Hopkins 15 14 21 21 21
Brown 8 15 17 16 20
U Chicago 12 16 6 7 9
Rice 16 17 18 24 24
Notre Dame 17 18 8 23 18
Vanderbilt 20 19 28 32 30
Emory 19 20 32 41 41
UC Berkeley 27 21 13 6 8
CMU 28 22 23 28 25
Georgetown 23 23 40 43 42
UVA 21 24 36 33 31
UCLA 31 25 14 10 12
U Michigan 24 26 4 3 3
Tufts 22 27 33 36 x
UNC CH 25 28 29 25 28
Wake Forest 25 29 49 48 x
USC 43 30 43 47 44
Wm & Mary 33 31 47 49 x
Lehigh 32 32 46 42 45
UC SD 34 33 x 50 26
Brandeis 29 34 31 19 39
U Rochester 30 35 24 34 36
U Wisconsin 41 36 16 9 11
Case Western 38 37 35 40 41
Georgia Tech 48 38 30 35 37
NYU 35 39 27 26 34
Boston C 38 40 48 46 x
UC Irvine 37 41 x 31 32
U Illinois 50 42 26 20 23
RPI 51 43 34 39 40
Tulane 36 44 25 37 x
UC SB 46 45 42 30 38
U Washington 42 46 20 22 27
Yeshiva 45 47 39 x 46
Penn State 51 48 44 29 33
UC Davis 40 49 41 27 35
Syracuse 44 50 45 44 x
U Florida 51 51 38 45 43</p>

<p>liberal arts colleges<br>
Williams 3 1 4 9 8
Amherst 2 2 6 1 2
Swarthmore 1 3 3 2 7
Wellesley 4 4 13 26 x
Carleton 9 5 10 34 x
Bowdoin 8 6 31 33 x
Pomona 5 7 7 6 5
Haverford 6 8 5 4 x
Middlebury 7 9 21 46 x
Claremont 15 10 18 7 6
Davidson 11 11 22 27 x
Wesleyan 14 12 8 16 x
Vassar 17 13 30 20 x
Wash&Lee 13 14 27 44 x
Colgate 20 15 15 3 3
Grinnell 16 16 29 32 x
Hamilton 25 17 32 25 x
Harvey Mudd x 18 17 11 x
Smith 12 19 12 10 x
Colby 18 20 45 41 x
Bates 22 21 43 31 x
Bryn Mawr 10 22 16 5 4
Mt Holyoke 19 23 25 12 x
Oberlin 24 24 1 15 x
Macalester 32 25 34 36 x
Trinity (CT) 21 26 24 17 x
Barnard 23 27 14 19 x
Bucknell 31 28 35 28 x
Colorado C 28 29 23 39 x
Lafayette 39 30 26 24 x
Scripps 36 31 33 29 x
Holy Cross 27 32 38 13 x
Kenyon 33 33 11 30 x
Sewanee 29 34 9 47 x
U Richmond 42 35 40 42 x
Connecticut C 26 36 36 45 x
Union 34 37 48 8 x
Whitman 41 38 28 37 x
Bard 40 39 49 43 x
Franklin&Marshall 34 40 37 48 x
Centre 41 41 46 49 x
Furman 41 42 41 40 x
Occidental 38 43 19 21 1
Skidmore 41 44 44 18 x
Dickinson 41 45 39 38 x
Rhodes 41 46 x 35 x
Gettysburg 41 47 42 50 x
Reed 37 48 2 14 x
DePauw 41 49 20 23 x
Sarah Lawrence 29 50 47 22 x</p>
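<p>If anyone wants to quantify how closely the two systems agree, here is a minimal Python sketch that computes the Spearman rank correlation between the 2006 US News ranks and the 1997 Gourman ranks, skipping the x entries. Only a handful of rows from the university table above are copied in; extend the dictionary with the rest of the rows to get the full picture.</p>

<pre><code>
# Minimal sketch: Spearman rank correlation between two ranking columns.
# Only a few rows from the table above are included for illustration.
from scipy.stats import spearmanr

# school: (US News 2006 rank, Gourman 1997 rank); None marks an 'x' entry
ranks = {
    "Harvard":   (1, 2),
    "Princeton": (2, 1),
    "Yale":      (3, 5),
    "U Penn":    (4, 17),
    "Duke":      (5, 19),
    "Tufts":     (27, None),  # 'x' in the Gourman 1997 column
}

# Keep only schools that appear in both rankings
pairs = [(us, g) for us, g in ranks.values() if g is not None]
us_news, gourman = zip(*pairs)

rho, _ = spearmanr(us_news, gourman)
print(f"Spearman rho = {rho:.2f} (n = {len(pairs)})")
</code></pre>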

<p>U Michigan 24 26 4 3 3</p>

<p>w t f?</p>

<p>do you have it in excel format?</p>

<p>Actually, Michigan was tied at #25 in the 2006 USNWR! hehe</p>

<p>Gourman tends to favor large research universities which explains why Michigan, Cal, Wisconsin and UCLA are all ranked among the top 10 nationally.</p>

<p>yes I have it in excel format</p>

<p>Actually, UCSD is tied for 32nd in the 2006 US News, not 33rd.</p>

<p>Collegehelp, there doesn’t seem to be a #4 or a #10 in the 1997 Gourman Report rank. Michigan is #3, Stanford #5 but there is no #4. And Chicago is #9 and Wisconsin is #11, but there is no #10.</p>

<p>I found it very surprising that Notre Dame was #8 in the 1967 Gourman report, since it certainly seems to have improved dramatically academically over the last generation, and yet it dropped in the '87 and '97 Gourman reports.</p>

<p>I also found it interesting that three similar schools like Colgate, Middlebury, and Bowdoin can be ranked so disparately (3, 46, 33) in the Gourman report, and that Union comes in #8 in Gourman's '87 report and Occidental #1 in the '97 report. What can this possibly be based on?</p>

<p>There is also a pronounced slant toward non-East Coast schools in the 1967 Gourman compared to the later editions.</p>

<p>Alexandre, you are right. The 1997 Gourman data lacks a #4 and a #10 and lists two schools at #41. Because of those gaps, the 1997 Gourman ranks were off by 1 or 2. Here is a corrected list, sorted by the 2006 US News:</p>

<p>school, US News 1997, US News 2006, Gourman rank 1967, Gourman rank 1987, Gourman rank 1997
universities<br>
Harvard 3 1 2 2 2
Princeton 2 2 1 1 1
Yale 1 3 3 4 4
U Penn 13 4 11 14 15
Duke 4 5 22 17 17
Stanford 6 6 9 5 5
Caltech 9 7 12 12 12
MIT 5 8 15 11 11
Columbia 11 9 5 15 13
Dartmouth 7 10 10 18 20
Washington U 17 11 37 38 27
Northwestern 9 12 19 13 14
Cornell 14 13 7 8 6
Johns Hopkins 15 14 21 21 19
Brown 8 15 17 16 18
U Chicago 12 16 6 7 8
Rice 16 17 18 24 22
Notre Dame 17 18 8 23 16
Vanderbilt 20 19 28 32 28
Emory 19 20 32 41 40
UC Berkeley 27 21 13 6 7
CMU 28 22 23 28 23
Georgetown 23 23 40 43 41
UVA 21 24 36 33 29
UCLA 31 25 14 10 10
U Michigan 24 26 4 3 3
Tufts 22 27 33 36 x
UNC CH 25 28 29 25 26
Wake Forest 25 29 49 48 x
USC 43 30 43 47 43
Wm & Mary 33 31 47 49 x
Lehigh 32 32 46 42 44
UC SD 34 33 x 50 24
Brandeis 29 34 31 19 37
U Rochester 30 35 24 34 34
U Wisconsin 41 36 16 9 9
Case Western 38 37 35 40 38
Georgia Tech 48 38 30 35 35
NYU 35 39 27 26 32
Boston C 38 40 48 46 x
UC Irvine 37 41 x 31 30
U Illinois 50 42 26 20 21
RPI 51 43 34 39 39
Tulane 36 44 25 37 x
UC SB 46 45 42 30 36
U Washington 42 46 20 22 25
Yeshiva 45 47 39 x 45
Penn State 51 48 44 29 31
UC Davis 40 49 41 27 33
Syracuse 44 50 45 44 x
U Florida 51 51 38 45 42</p>
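<p>To make the correction concrete: when two schools tie on the underlying score, "competition" style ranking skips the next number (two schools at #3 means there is no #4), while the corrected list above uses dense ranks with no gaps. Here is a minimal sketch with invented scores (the actual Gourman scores are not reproduced here):</p>

<pre><code>
# Minimal sketch of the two ranking conventions, using invented scores.
# 'min' (competition) ranking skips a number after a tie; 'dense' does not.
from scipy.stats import rankdata

scores = [4.92, 4.90, 4.88, 4.88, 4.85]  # hypothetical ratings, higher is better

competition = rankdata([-s for s in scores], method="min")
dense = rankdata([-s for s in scores], method="dense")

print([int(r) for r in competition])  # [1, 2, 3, 3, 5] -- no #4, as in the raw 1997 data
print([int(r) for r in dense])        # [1, 2, 3, 3, 4] -- no gaps, as in the corrected list
</code></pre>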

<p>What does x mean? That it isn’t in the top 50? If that’s the case, it certainly calls into question the veracity of a ranking where a very good school like Haverford can drop from #4 to out of the top 50.</p>

<p>gellino: x means the information is missing. In 1997, the Gourman Report did not give the numerical score for every college, only for the top 100. Very few LACs make the top 100 in the Gourman Report. I was able to assign a Gourman rank to all the LACs for 1967 and 1987 because the numerical ratings were provided for all the LACs. Haverford might have been ranked high among LACs in 1997 too, but I have no way of knowing. The x does not necessarily mean that the school was below the top 50.</p>
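<p>In other words, the x entries are treated as missing data rather than as low ranks. A minimal pandas sketch of how that plays out (the scores here are invented; NaN stands in for a school whose 1997 score wasn't published):</p>

<pre><code>
# Minimal sketch: ranking when some scores are missing. A school whose score
# wasn't published gets NaN, and its rank stays NaN instead of being pushed
# to the bottom of the list. All scores are invented.
import numpy as np
import pandas as pd

scores_1997 = pd.Series(
    {"Occidental": 4.55, "Pomona": 4.51, "Claremont": 4.49, "Haverford": np.nan}
)

ranks = scores_1997.rank(ascending=False)  # na_option='keep' is the default
print(ranks)
# Occidental    1.0
# Pomona        2.0
# Claremont     3.0
# Haverford     NaN
</code></pre>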

<p>Wow, thanks for posting these, collegehelp; they really are insightful. I personally think it's useful, especially when looking for certain inconsistencies. For example, UPenn's rankings are 13, 4, 11, 14, 15. Now, I think UPenn is a great school, but I don't think it should be ranked 4th in the US ahead of Stanford, MIT, etc. Just for the record, I don't want to turn this into another 'why is UPenn ranked so high' thread; I'm just trying to point out certain ways these rankings can be used.</p>

<p>It’s also interesting to see that for some schools, the U.S. News rankings have been consistent over the years and so have the Gourman rankings, but the two systems produce very different numbers. Cornell (with a better Gourman ranking) and Georgetown (with a better U.S. News ranking) are examples.</p>

<p>Does anybody know how the two sets of ranking criteria differ?</p>

<p>The Gourman Report states that its ratings are based on “extensive research” into the following criteria:</p>

<ol>
<li>auspices, control, and organization of the institution</li>
<li>numbers of educational programs offered and degrees conferred (with additional attention to “sub-fields” available to students within a particular discipline)</li>
<li>age (experience level) of the institution and the individual discipline or program and division</li>
<li>faculty, including qualifications, experience, intellectual interests, attainments, and professional productivity (including research)</li>
<li>students, including quality of scholastic work and records of graduates both in graduate study and in practice</li>
<li>basis of and requirements for admission of students (overall and by individual discipline)</li>
<li>number of students enrolled (overall and for each discipline)</li>
<li>curriculum and curricular content of the program or discipline and division</li>
<li>standards and quality of instruction (including teaching loads)</li>
<li>quality of administration, including attitudes and policy towards teaching, research and scholarly production in each discipline, and administration research</li>
<li>quality and availability of non-departmental areas such as counseling and career placement services</li>
<li>quality of physical plant devoted to undergraduate, graduate, and professional levels</li>
<li>finances, including budgets, investments, expenditures and sources of income for both public and private institutions</li>
<li>library, including number of volumes, appropriateness of materials to individual disciplines and accessibility of materials</li>
<li>computer facility sufficient to support current research activities for both faculty and students</li>
<li>sufficient funding for research equipment and infrastructure</li>
<li>number of teaching and research assistantships</li>
<li>academic-athletic balance</li>
</ol>

<p>The weight given to each criterion above varies by discipline.</p>
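<p>Since Gourman never published his weights, the following is only a hypothetical sketch of how a weighted composite of criteria like those above could yield a per-discipline rating. Every number in it is invented for illustration; nothing here reflects the actual Gourman weighting.</p>

<pre><code>
# Hypothetical sketch of a weighted composite rating. Both the weights and
# the criterion scores are invented; Gourman never published his weights.

# Criterion scores for one school, each on a 0-5 scale (invented values)
criterion_scores = {
    "faculty": 4.6,
    "students": 4.3,
    "curriculum": 4.4,
    "library": 4.0,
    "finances": 4.2,
}

# Per-discipline weights (invented); each set sums to 1.0
weights_by_discipline = {
    "English":     {"faculty": 0.35, "students": 0.25, "curriculum": 0.25,
                    "library": 0.10, "finances": 0.05},
    "Engineering": {"faculty": 0.30, "students": 0.20, "curriculum": 0.20,
                    "library": 0.05, "finances": 0.25},
}

def composite(scores, weights):
    """Weighted sum of criterion scores for one discipline."""
    return sum(scores[c] * w for c, w in weights.items())

for discipline, weights in weights_by_discipline.items():
    print(f"{discipline}: {composite(criterion_scores, weights):.2f}")
</code></pre>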

<p>I had to jump in here on this discussion of the Gourman reports (as I’ve done once before on cc).</p>

<p>Not only are his most recent ratings 10 years old, but Jack Gourman's publications ranking different programs have been repeatedly panned and dismissed by reputable sources such as The Chronicle of Higher Education. For example, when Mr. Gourman has been asked why he refuses to reveal the methodologies behind his so-called rankings, he has repeatedly said that the only people concerned with methodology are reporters. That's certainly news to me, since I am wondering how he came up with these rankings. According to the many critiques of the Gourman reports (including many from schools that did rather well in his surveys), college officials seem particularly curious about how Mr. Gourman came up with his results, noting that he has never contacted any of the institutions on his lists for information. In fact, in several of his lists, he has included programs or departments that simply don't exist.</p>

<p>No one knows for sure what research Mr. Gourman did to come up with his rankings. Did he do more than search his feelings for answers? Other surveys and rankings that are out there, including U.S. News and World Report's rankings of colleges, professional programs, and graduate programs (whether or not you agree with their rankings), the Vault, and others, all actually take the time to contact the schools, programs, or employers in question and conduct actual surveys with defined, repeatable methods to determine their rankings. You may not agree with the weight given to one factor or another, but all of these surveys and rankings put their methodologies out there for anyone to inspect. Mr. Gourman does nothing of the kind. Do you have any idea what weight Mr. Gourman gave to the quality of the faculty, the breadth of the courses offered, the recruiting and job placement atmosphere, or the quality of the student body (however that is measured) at any of his ranked schools? You can't. I can't either, because Mr. Gourman thinks we are too dumb to care about or understand why he made his choices, or, according to Mr. Gourman, because only reporters care about such things.</p>

<p>Please approach Mr. Gourman’s rankings with a critical eye and take them for what they are worth. Many very reliable scholarly sources have suggested that his numbers are nothing but one man’s unresearched opinion (save his claim to have made some phone calls to professors), so it’s up to you to decide whether you trust Mr. Gourman’s intuition.</p>

<p>The criteria that were used in the Gourman Report are listed in post #14. The Gourman book states that they did contact colleges. The idea that they did not contact colleges is a myth, I believe. The Gourman book is fairly explicit about methodology.</p>

<p>The Gourman Report classifies programs together that are similar in nature even though they may have very different names. At the bottom of each page in the Gourman Report there is a list of the different program names that are classified together on that page. I think the misconception that Gourman rated nonexistent programs stems from the fact that he used generic headings for programs with widely different names.</p>

<p>Regarding the weight given to various criteria: Gourman states that the weights depend on the program. It makes sense to weight the criteria differently for English and for Engineering, for example.</p>

<p>In any case, the Gourman rankings are valid. College quality changes very slowly and significant movement relative to other colleges is rare. They largely agree with US News rankings and they are in rough agreement with what experienced posters on CC have to say. If the Gourman rankings are valid, it matters little what the method was.</p>

<p>It is certainly up to you whether to take the Gourman reports at face value (which you seem to have done in many prior posts). However, you must at least acknowledge that every scholarly publication that has examined his rankings, along with several universities included in those rankings, has openly questioned his methodologies or lack thereof. There is nothing explicit about his so-called methodology. Though the Gourman book may state that he contacted colleges, no college has ever acknowledged actually having been contacted by Mr. Gourman. Furthermore, Mr. Gourman himself acknowledged in later interviews that he never contacted any official college sources, but rather spoke to selected professors at certain colleges. That is hardly the same thing.</p>

<p>Any rating or ranking system is valid only if it specifies what its criteria are, and while some of those criteria may be subjective, or may differ from subject area to subject area, you must have the ability to examine what is going on behind the numbers. Mr. Gourman gives no such information. For all any of us know, Mr. Gourman woke up one day and wrote down the first thing that popped into his head. You certainly can't say that he didn't do just that. The method of determining the results is actually the key to the validity of any such rating or ranking system. Without a method, what you end up with is simply one person's educated (or perhaps uneducated) opinion.</p>

<p>There are some things in the Gourman Report that suggest it was well researched. It recognizes U Delaware for Chem Eng and Claremont McKenna for Econ, for example, things that are not widely known and unlikely to be based on intuition. </p>

<p>The weakness is that it uses criteria that favor research universities. It would have been better to have separate ratings for LACs. Some people rightly fault the Gourman Report for this. I was pleased to discover that the 1967 edition has data for all the LACs.</p>

<p>An interesting thing about the 1967 edition is that the pages consist of forms filled in by hand and include crossed-out corrections. Every college is rated like a report card in various subjects: an “A” in psychology, a “C” in physics, for example. Someone seems to have put some thought into the process and made revisions.</p>

<p>Sallyawp, most members of academe, and most scholarly publications, mock rankings because intellectuals believe that universities cannot be simplistically ranked. Can you name one ranking that is actually respected by the academic world? At least Gourman's subject rankings are almost identical to those of the NRC and the USNWR graduate rankings, so there must be some accuracy in there.</p>

<p>The difference between Gourman and, say, the USNWR is that the uneducated masses choose not to believe the former and to revere the latter.</p>

<p>I'm certainly no big fan of USNWR or any other rankings, though they do have their uses. All I'm suggesting is that one needs to approach the Gourman reports with some skepticism, just as I would suggest approaching information published on a random website with skepticism unless and until you know your source. You don't have to agree with the methods USNWR uses to arrive at its rankings, but at least you can see the methodical process that got them there. You can decide that you disagree with one or more factors that USNWR chose to emphasize in its rankings, and then address the judgments made by USNWR logically. You cannot do the same for Mr. Gourman's reports.</p>

<p>You can choose to simply agree with Mr. Gourman’s judgments, however he arrived at them. That’s absolutely fine. Truly. My statements have nothing to do with reverence for USNWR and disdain for Mr. Gourman. I just hope that people will consider the source, the reputation of the source and the methodologies used by that source to arrive at a determination logically. Just for the record, this approach would apply equally to a news article on a controversial topic on CNN.com or an advertising pitch by a large company, for example.</p>