<p>On graduate study, at the top schools that receive most of the attention on CC, most students go on to some sort of advanced degree. Many of these are not doctoral programs in the students' fields of study, but medical school for bio majors or business school for economics majors certainly should count.</p>
<p>The real problem with this method is that it uses unproven proxies as indices of quality. Assume that students are attracted to top departments, so give the departments credit for having large numbers of majors. Assume that having lots of professors is good, so give the departments credit for that. Assume that the distribution of SAT scores across the departments is uniform, so give a department credit for high average scores at the school. What about a place like CMU, where the overall SAT average is lower than at some other top universities, but the average in the SCS is astronomical? Do you penalize the SCS for the lower scores of the artists, or do you assume that the college of fine arts is even better than it is because of the computer geeks? Is the SAT score even remotely meaningful for the quality of the drama department?</p>
<p>The real tests of department quality would be outcome measures: how much do the students learn in their fields? Since there are few direct tests of this, how about outcome-based proxies? For many fields, PhD production would be relevant for comparing one college to another. A strong record of PhD production means that the students are talented enough to make it through a doctoral program, and that their college experience both prepares them for graduate school and makes them want to take that route. If one added professional school attendance for those fields in which this is relevant, then the result would be an outcome measure that did reflect the experience of most students at top colleges. Of course, this would not be useful at all for fields in which further education is rare and of little importance; again the art, drama, and music areas come up, for people who will be performers, not academics. For these fields one would need other evidence of career success: are they working in their fields, tenured in top orchestras, winning acting awards, having major art exhibitions? Very hard to get data.</p>
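<p>The per-capita idea behind the PhD-production measure can be sketched in a few lines of Python. The college names and counts below are invented for illustration, and normalizing to PhDs per 100 bachelor's graduates is just one reasonable choice; real figures would have to come from something like the NSF's doctorate surveys.</p>

```python
# Hypothetical sketch of a per-capita PhD-production metric.
# All numbers are invented for illustration only.

# (college, bachelor's degrees awarded, later PhDs earned by those grads)
colleges = [
    ("College A", 500, 60),
    ("College B", 2000, 120),
    ("College C", 300, 45),
]

def phd_rate(grads, phds):
    """PhDs later earned per 100 bachelor's graduates."""
    return 100 * phds / grads

# Rank by rate, not by raw count: a small college that sends a large
# share of its graduates on to doctorates outranks a big one that
# produces more PhDs in absolute terms.
ranked = sorted(colleges, key=lambda c: phd_rate(c[1], c[2]), reverse=True)
for name, grads, phds in ranked:
    print(f"{name}: {phd_rate(grads, phds):.1f} PhDs per 100 grads")
```

<p>Note that by this (made-up) data, College B produces the most PhDs in absolute terms but ranks last per capita, which is exactly the distinction that separates an outcome measure from a size proxy.</p>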
<p>To focus on top students at top colleges, look at the number who win prestigious academic awards- Putnam competition, NSF fellowships, etc. Checking how many undergrads publish research papers would be helpful, but this data is almost impossible to obtain.</p>
<p>The NSSE is entirely proxies. The things it asks about sound like they should be important, but do people who have these sorts of engagement with the faculty and their studies really end up better off for it? Is there proof? This also would penalize the technical schools, and those broader universities with large numbers of technical majors, since engineering tends to score low on these measures, but places like MIT turn out critical thinkers in huge numbers.</p>