<p>The following was published today on the website of the Chronicle of Higher Education. Like its subject, it will no doubt fan the flames of controversy. Anyway, here’s the fuel:</p>
<p>International Group Endorses Principles for Ranking of Higher-Education Institutions
By BURTON BOLLAG</p>
<p>An international group of educators, higher-education experts, and publishers that met in Berlin last month has come up with a set of principles for ranking colleges and universities. </p>
<p>The 16 principles of good practice, dubbed the “Berlin Principles on Ranking of Higher Education Institutions,” are a response to the explosion of college rankings in many countries since U.S. News & World Report published the first such listings in 1983. </p>
<p>College rankings – or league tables, as they are known in Britain – were at first widely dismissed in academe, but they appear to be heavily used by students and their parents. </p>
<p>Some critics in academe have derided the whole concept of ranking institutions as a trivialization of higher education. Critics have also pointed to weaknesses in various ranking systems in different countries. </p>
<p>The Berlin principles are meant to serve as guidelines that groups that produce rankings are free to adopt. </p>
<p>In a statement accompanying the list of principles, the group wrote that the purpose of the guidelines was to ensure that “those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination.” </p>
<p>Jamie P. Merisotis, president of the Institute for Higher Education Policy, an independent group based in Washington, called the principles “the beginnings of a self-regulatory process.” </p>
<p>Among the principles are recommendations that rankings should: </p>
<ul>
<li>Recognize the diversity of institutions and take the different missions and goals of institutions into account. </li>
<li>Be transparent regarding the methodology used for creating the rankings. </li>
<li>Measure outcomes, such as retention and graduation rates, in preference to inputs, such as entrance-examination scores, whenever possible. </li>
<li>Use audited and verifiable data whenever possible. </li>
<li>Provide consumers with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed, such as by allowing them to determine how factors are weighed on interactive Web sites. </li>
</ul>
<p>The meeting, which was attended by 47 people from a dozen countries, was organized by Mr. Merisotis’s group and the Unesco-European Centre for Higher Education, which is based in Bucharest, Romania. </p>
<p>“Both organizations are neutral on ranking,” said Mr. Merisotis, who was among the participants. But in the face of a growing number of annual rankings of colleges and universities – at least one, and sometimes many, are now published in more than 20 countries – the two organizations felt it was time to act. </p>
<p>“There needs to be some structure for a conversation” on improving the quality of rankings, said Mr. Merisotis, “particularly since there has been so much criticism about methodology.” </p>
<p>Jan Sadlak, director of the Unesco center in Bucharest, added that the principles are meant to counter what many academic leaders see as the superficial and capricious nature of rankings. “If we’re going to have to live with it, let’s do it in the least destructive way,” he said. </p>
<p>Robert J. Morse, who directs the college rankings at U.S. News & World Report and attended the Berlin meeting, says the publication has continually improved its ranking system over the past two decades. For example, criteria have been shifted to give more emphasis to outcomes and less to input measures. </p>
<p>A more recent ranking system, run by the Institute of Higher Education, at Shanghai Jiao Tong University, in China, is considered the most influential international ranking. It has been criticized for giving too much emphasis to Nobel Prizes won by faculty members, even decades earlier. Officials at the Chinese institute are paying attention to those concerns, said Mr. Sadlak, of Unesco. “There will be some corrections.” </p>
<p>Rankings have been criticized for placing pressure on administrators to make decisions based not on academic needs, but on what may boost their institution’s standing in the rankings. Nonetheless, Mr. Merisotis says they have become “the third leg of the quality-assurance stool, along with accreditation and government regulation and licensing.” </p>
<p>“Consumers like them,” he said. “They’re not going away.” </p>
<p>Last month’s meeting was the third since 2004 to work on finding ways to bring some order to the use of rankings. A fourth meeting is planned in fall 2007 in Shanghai. People attending that gathering, organizers say, will discuss the idea of establishing a system of certification of rankings that follow principles of good practice.</p>