New Ranking Criteria

<p>The following was published today on the website of the Chronicle of Higher Education. It, like the subject, will no doubt fan the flames of the controversy. Anyway, here’s the fuel:</p>

<p>International Group Endorses Principles for Ranking of Higher-Education Institutions
By BURTON BOLLAG</p>

<p>An international group of educators, higher-education experts, and publishers that met in Berlin last month has come up with a set of principles for ranking colleges and universities. </p>

<p>The 16 principles of good practice, dubbed the “Berlin Principles on Ranking of Higher Education Institutions,” are a response to the explosion of college rankings in many countries since U.S. News & World Report published the first such listings in 1983. </p>

<p>College rankings – or league tables, as they are known in Britain – were at first largely dismissed in academe, but they are now widely used by students and their parents. </p>

<p>Some critics in academe have derided the whole concept of ranking institutions as a trivialization of higher education. Critics have also pointed to weaknesses in various ranking systems in different countries. </p>

<p>The Berlin principles are meant to serve as guidelines that groups producing rankings are free to adopt. </p>

<p>In a statement accompanying the list of principles, the group wrote that the purpose of the guidelines was to ensure that “those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination.” </p>

<p>Jamie P. Merisotis, president of the Institute for Higher Education Policy, an independent group based in Washington, called the principles “the beginnings of a self-regulatory process.” </p>

<p>Among the principles are recommendations that rankings should: </p>

<p>Recognize the diversity of institutions and take the different missions and goals of institutions into account. </p>

<p>Be transparent regarding the methodology used for creating the rankings. </p>

<p>Measure outcomes, such as retention and graduation rates, in preference to inputs, such as entrance-examination scores, whenever possible. </p>

<p>Use audited and verifiable data whenever possible. </p>

<p>Provide consumers with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed, such as by allowing them to determine how factors are weighed on interactive Web sites.</p>

<p>The meeting, which was attended by 47 people from a dozen countries, was organized by Mr. Merisotis’s group and the Unesco-European Centre for Higher Education, which is based in Bucharest, Romania. </p>

<p>“Both organizations are neutral on ranking,” said Mr. Merisotis, who was among the participants. But in the face of a growing number of annual rankings of colleges and universities – at least one, and sometimes many, are now published in more than 20 countries – the two organizations felt it was time to act. </p>

<p>“There needs to be some structure for a conversation” on improving the quality of rankings, said Mr. Merisotis, “particularly since there has been so much criticism about methodology.” </p>

<p>Jan Sadlak, director of the Unesco center in Bucharest, added that the principles are meant to improve what many academic leaders see as the superficial and capricious nature of rankings. “If we’re going to have to live with it, let’s do it in the least destructive way,” he said. </p>

<p>Robert J. Morse, who directs the college rankings at U.S. News & World Report and attended the Berlin meeting, says the publication has continually improved its ranking system over the past two decades. For example, criteria have been shifted to give more emphasis to outcomes and less to input measures. </p>

<p>A more recent ranking system, run by the Institute of Higher Education, at Shanghai Jiao Tong University, in China, is considered the most influential international ranking. It has been criticized for giving too much emphasis to Nobel Prizes won by faculty members, even decades earlier. Officials at the Chinese institute are paying attention to those concerns, said Mr. Sadlak, of Unesco. “There will be some corrections.” </p>

<p>Rankings have been criticized for placing pressure on administrators to make decisions based not on academic needs, but on what may boost their institution’s standing in the rankings. Nonetheless, Mr. Merisotis says they have become “the third leg of the quality-assurance stool, along with accreditation and government regulation and licensing.” </p>

<p>“Consumers like them,” he said. “They’re not going away.” </p>

<p>Last month’s meeting was the third since 2004 to work on finding ways to bring some order to the use of rankings. A fourth meeting is planned in fall 2007 in Shanghai. People attending that gathering, organizers say, will discuss the idea of establishing a system of certification of rankings that follow principles of good practice.</p>

<p>Here’s another take on the subject, from an American POV. This was from another Chronicle article published two months ago:</p>

<p>Another Accountability Idea: a New Database That Would Customize College Rankings
By KELLY FIELD</p>

<p>While testing of students is the most publicized piece of the Commission on the Future of Higher Education’s approach to accountability for colleges, it is only one piece. </p>

<p>The panel is also working on a concept for a database that would allow consumers to rank colleges based on variables of their choosing, and it intends to endorse a controversial plan to create another database to track the educational progress of every college student in the United States. </p>

<p>The rankings database, which Charles Miller, chairman of the commission, envisions as an alternative to the U.S. News & World Report college rankings, will contain information from the Education Department’s Integrated Postsecondary Education Data System survey, which includes data on enrollment, institutional revenue and expenditures, tuition, and other key indicators, as well as information on institutional performance. </p>

<p>It will not include students’ scores on any specific standardized test, like the Collegiate Learning Assessment, though Mr. Miller says that information could be incorporated into the database if a test emerged as the national standard. </p>

<p>Unlike U.S. News & World Report, the database will not rank colleges – Mr. Miller says the Education Department “is not in the ranking business” – but it will allow users to generate personal rankings based on the weights they assign to variables like total enrollment, student aid, and graduation rates. </p>
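<p>For readers curious what “generate personal rankings based on the weights they assign to variables” would mean in practice, here is a minimal sketch of the underlying weighted-sum calculation. The institution names, indicator values, and weights below are invented for illustration only; they are not drawn from IPEDS or from the commission’s actual plan.</p>

```python
# Minimal sketch of a "build your own ranking" calculation:
# each indicator is min-max normalized, "lower is better" indicators
# are inverted, and institutions are scored by a user-weighted sum.

# Indicators where a smaller raw value should score higher.
LOWER_IS_BETTER = {"net_price"}


def custom_ranking(institutions, weights):
    """Return (name, score) pairs sorted from highest to lowest score."""
    normalized = {}
    for indicator in weights:
        values = [data[indicator] for data in institutions.values()]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0  # avoid division by zero when all values match
        normalized[indicator] = {}
        for name, data in institutions.items():
            score = (data[indicator] - lo) / span
            if indicator in LOWER_IS_BETTER:
                score = 1.0 - score
            normalized[indicator][name] = score

    totals = {
        name: sum(weight * normalized[indicator][name]
                  for indicator, weight in weights.items())
        for name in institutions
    }
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)


# Invented example data -- not real figures.
institutions = {
    "College A": {"graduation_rate": 0.91, "net_price": 18000, "enrollment": 5200},
    "College B": {"graduation_rate": 0.78, "net_price": 11000, "enrollment": 21000},
    "College C": {"graduation_rate": 0.85, "net_price": 15000, "enrollment": 9800},
}

# One user's priorities; another user could weight net price far more heavily
# and get a completely different ordering from the same data.
weights = {"graduation_rate": 0.6, "net_price": 0.3, "enrollment": 0.1}

for name, score in custom_ranking(institutions, weights):
    print(f"{name}: {score:.2f}")
```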

<p>“You could create an infinite number of rankings, and then all of a sudden, it’s no longer a monopoly,” he says. “It won’t be in the hands of one publisher who won’t tell you how they got the rankings.” (Robert J. Morse, director of data research for U.S. News & World Report, says the magazine has given out such detailed explanations of the rankings “that many people in higher ed have produced simulated models.”) </p>

<p>While the idea of a rankings database is likely to appeal to colleges, the commission’s plan to endorse a proposed Education Department system that would collect individual student data and track students’ progress is more divisive. Supporters of such a unit-record system, including lobbyists for state colleges, say it would allow the government to better track transfer students and calculate an institution’s net price, or what students actually pay after financial aid is taken into account. </p>

<p>Opponents of unit records, including private colleges and privacy-rights groups on both the left and the right, say the system – which would utilize students’ Social Security numbers – is fraught with possible security problems. </p>

<p>The commission, however, “is convinced that there is good technology out there that can provide privacy guarantees,” said Charles M. Vest, a commission member and president emeritus of the Massachusetts Institute of Technology, at a recent public hearing in Boston. </p>

<p>Education Department officials first offered the idea of a unit-record system in 2004, but that idea was rejected by members of Congress from both political parties. Asked in an interview if he was beating a dead horse, Mr. Miller said he was not focusing on the politics of the proposal. </p>

<p>“Our job is to provide recommendations that help the system get better,” he said.</p>

<p>Interesting thread.</p>

<p>About time! College is not one-size-fits-all, after all. Sad to say, there are more than a few kids who foolishly choose their college based on perceived differences in rankings. </p>

<p>On the graduate school level, the <a href="http://www.stat.tamu.edu/~jnewton/nrc_rankings/nrc41.html">NRC rankings</a> were published to rank programs. However accurate these rankings may be, I much prefer <a href="http://www.phds.org/rankings/">http://www.phds.org/rankings/</a>. Although it uses the NRC information, you can use your own criteria to come up with a specialized ranking. I think making a similar site for US News would be great for high school students!</p>

<p>I agree with the proposal to make a site similar to the one in the second post; I certainly have different factors in mind when selecting a school than they do, and I’d like a ranking system like that.</p>

<p>Too bad it’ll be too late for the class of '11.</p>