CSM: College presidents plan 'U.S. News' rankings boycott

<p>There is an interesting opinion piece in Inside Higher Ed by Richard Hersh, former president of Hobart and William Smith and current president of Trinity College. President Hersh, a prime mover in the collective-action protests against the USNWR rankings, makes an important link between the controversy over ranking systems and the federal assessment movement that is spurring the call for increased accountability, transparency, and the development of new metrics aimed at measuring educational “quality”.</p>


<p><a href="http://www.insidehighered.com/views/2007/04/24/hersh">http://www.insidehighered.com/views/2007/04/24/hersh</a></p>

<p><a href="http://www.csmonitor.com/2002/1001/p11s02-lehl.html">http://www.csmonitor.com/2002/1001/p11s02-lehl.html</a></p>

<p>Hersh’s call is for colleges to act before the federal government takes it upon itself to regulate (my word, not his) accountability and quality standards, a train wreck he calls “No College Left Behind”.</p>

<p>What is not discussed in the article, but should be:
The problem for institutions of higher education is that in the eyes of the federal government there are no private colleges, or at least there are precious few. Any institution that receives the benefit of federal funds, even indirectly, would likely be forced to comply or face the loss of that funding, including student and parental acceptance of government-backed loans and tax credits. Yale Law School can’t ban Army recruiters without losing federal funding for the entire institution. Federal accountability standards for educational quality would undoubtedly be treated the same way. How many institutions would be willing to forgo federal assistance to preserve the integrity of their mission?</p>

<p>Self-regulation by the IHEs is a far better and less burdensome approach, but I still see the potential for compromising mission. “And it is not uncommon to hear faculty and administrators across the country protest … that the diversity of college and university missions precludes one-size-fits-all assessment…” I would agree with that argument. However, the danger with “one-size-fits-all assessment” is that it would tend to breed conformity as institutions seek to remain in the good graces of this new accountability/accrediting entity. Colleges would start to look like chain-restaurant franchisees with slightly different decor but the same menu.</p>


<p>In so many words, we can thank Richard Hersh for pointing out what a mere review of the Peer Assessment reveals: asking the “marketplace” to follow its “trustworthy” leaders blindly and without accountability will continue to be challenged.</p>

<p>The hardest part remains how best to measure the elusive quality in education. It is possible --and USNEWS does a good job at it-- to measure selectivity, graduation rates, pedigrees of faculty, sizes of classes, and financial resources. A good number of the listed elements, however, do track the wealth of the schools, if not the wealth of the students. Should we not also measure the schools that do a lot more with a lot less? The unshakable belief that the more we spend on education the better it is might turn into a quest for pyrite.</p>
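<p>To make concrete how those measurable inputs combine into a ranking -- and how heavily the spending-linked ones can weigh -- here is a minimal sketch of a USNWR-style weighted composite. The weights, metric names, and the two schools are invented for illustration; this is not USNWR’s actual methodology.</p>

```python
# Hypothetical USNWR-style composite: weighted sum of normalized (0-1) metrics.
# All weights and data below are invented for illustration.
WEIGHTS = {
    "selectivity": 0.15,            # e.g., 1 - admit rate
    "graduation_rate": 0.25,
    "faculty_pedigree": 0.20,
    "small_class_share": 0.15,      # fraction of classes under 20 students
    "spending_per_student": 0.25,   # normalized financial resources
}

def composite(school):
    """Weighted sum of a school's normalized metric values."""
    return sum(w * school[m] for m, w in WEIGHTS.items())

wealthy = {"selectivity": 0.90, "graduation_rate": 0.95, "faculty_pedigree": 0.90,
           "small_class_share": 0.80, "spending_per_student": 1.00}
lean    = {"selectivity": 0.60, "graduation_rate": 0.85, "faculty_pedigree": 0.70,
           "small_class_share": 0.60, "spending_per_student": 0.30}

# Note how wealth-linked inputs (spending, small classes) carry 40% of the weight.
print(f"wealthy: {composite(wealthy):.2f}, lean: {composite(lean):.2f}")
```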

<p>We only have to look at high schools led by drunken sailors and mobsters waving placards of “Mo’ money and less accountability.” All the while, the real quality teaching is often delivered at a fraction of the cost by the schools that have maintained a sense of integrity and responsibility.</p>

<p>siserune,
I understand your concerns about the quality and objectivity of the responses that one would get from surveying students, alumni, and recruiters; however, I don’t conclude that this difficulty makes the exercise useless. The NSSE surveys provide some guidance on the type of questions that can be asked of students, and surely similar questionnaires could be created for the audiences of alumni and corporate recruiters.</p>

<p>I strongly agree with your suggestions for the collection of quantitative data such as “postgraduate outcomes in employment, salary, debt and grad school admission.” Such numbers would be of great value to high school students as they make their college selections and might help eliminate a lot of the fog that surrounds the “prestige” schools and whether their brands truly deliver higher value outcomes.</p>

<p>standrews,
To use your analogy of restaurants: there are many types of restaurants, just as there are many types of colleges. Expensive restaurants, cheap restaurants. Mexican restaurants, Chinese restaurants, Italian restaurants, Indian restaurants, etc. Fast-food restaurants, take-out restaurants, theme restaurants. And on and on. Yet consumers can differentiate among them and make judgments about the quality of their offerings even while those consumers have a vast assortment of restaurant tastes. Colleges have likewise created institutional brands - Liberal Arts colleges, Engineering colleges, State Universities, Community colleges, All-girls colleges, etc. It is not that hard to make measurements among (and perhaps even between) these types of colleges, just as it is not hard to make measurements among (and again even between) restaurants. This variety is far from a single chain franchise and the d</p>

<p>I agree with the previous posters that this statement is naive. SL cares DEEPLY about their image. They can’t continue to be one of the most pricey schools if their perceived selectivity drops.</p>

<p>From a speech Hersh gave to the Council of Independent Colleges, published in the Chronicle of Higher Education.</p>



<p>It would help to define quality, but even a good definition is rather elusive. Quality engineers and Six Sigma types have given this a lot of thought. Here are some common definitions:

  • conformance to specifications
  • conformance to standards
  • adherence to customer requirements
  • customer satisfaction
  • meeting customer needs
  • fitness for use
  • the degree to which you perfectly meet the customer’s current and future needs
  • minimizing loss to society</p>

<p>One thing that needs to be made clear is the difference between quality and excellence. Under the definitions above, a Kia and a Mercedes can both be quality cars. They may have different specifications and customers, but each manufacturer is able to meet its specifications and satisfy its customers. With respect to excellence, the Mercedes has more demanding specifications and its customers have higher expectations. That results in it being an excellent car.</p>

<p>College accrediting organizations function well to identify quality institutions, i.e., those that meet the standards for certification, but they don’t say much about the excellence of the institutions they certify. How should institutional excellence be determined? Should it be judged on the knowledge and preparedness of graduating seniors? Or should it be judged on the value added to the graduating seniors during their college years (grading on improvement)? Or perhaps some other measure? Keep in mind, the goal is to determine the excellence of the institution, and not the students it graduates. (The excellence of the graduates will be assessed separately, usually by grad schools and employers.)</p>
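<p>One concrete way to “grade on improvement” is a predicted-vs-actual approach: fit a model that predicts an outcome, say graduation rate, from entering-student inputs, then score each school on how far it beats its own prediction. The sketch below is a toy illustration with invented data and inputs, not any accreditor’s or USNWR’s actual model.</p>

```python
import numpy as np

# Invented inputs per school: [avg SAT / 100, fraction of Pell-eligible students]
X = np.array([[13.5, 0.15],
              [11.0, 0.40],
              [14.2, 0.10],
              [10.5, 0.55]])
y = np.array([0.92, 0.78, 0.93, 0.70])   # actual 6-year graduation rates

# Least-squares fit: predicted graduation rate as a linear function of inputs.
A = np.column_stack([X, np.ones(len(X))])     # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

value_added = y - A @ coef   # positive = school outperforms its entering class
print(np.round(value_added, 3))
```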


<p>What is an example of a subjective question that would provide anything but mush?<br>
I can’t think of one, and NSSE’s sample forms don’t inspire confidence.</p>

<p>Opinion surveys generally are useful only if what you want to understand is opinion itself, e.g., the prestige associated with various colleges. If you want to understand the college per se, then any meaningful student opinion measurement would end up being computable from objective data that you can gather directly: class size, quality of life indices (food, housing, facilities, location), annual expenditure per undergrad.</p>

<p>Objective or numerical questions to students would reveal something as well, but there are all sorts of biases and problems with that. IMO, anything detectable by a survey of thousands of students can be revealed by a few numerical data points about the college. That is, a more refined version of the USNWR statistical questionnaire should, in principle, do the job.</p>
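<p>That claim is itself testable: gather both survey results and objective data for a set of schools and check how strongly they track each other. A toy sketch with invented figures (the index and satisfaction numbers below are made up):</p>

```python
import numpy as np

# Invented figures for five schools: an index built from objective inputs
# (class size, spending per undergrad, housing quality) vs. the mean student
# satisfaction reported on a 1-4 survey scale.
objective_index = np.array([0.82, 0.55, 0.91, 0.40, 0.67])
mean_satisfaction = np.array([3.6, 2.9, 3.8, 2.5, 3.1])

# A correlation near 1 would support the view that the survey adds little
# beyond the objective data; a low one would suggest the survey captures more.
r = np.corrcoef(objective_index, mean_satisfaction)[0, 1]
print(f"correlation: {r:.2f}")
```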

<p>An article in today’s California Aggie Online puts the number of universities that have signed up at 11. The article also includes an interesting comment by Robert Morse, director of data analysis for the U.S. News college rankings, who takes the position that bigger universities - in contrast to smaller liberal arts schools - don’t think rankings are destructive because “they believe in accountability and assessment”.</p>


<p><a href="http://media.www.californiaaggie.com/media/storage/paper981/news/2007/04/25/CityNews/Universities.Protest.College.Ranking.Systems-2879509-page2.shtml">http://media.www.californiaaggie.com/media/storage/paper981/news/2007/04/25/CityNews/Universities.Protest.College.Ranking.Systems-2879509-page2.shtml</a></p>

<p>I have a very limited knowledge of NSSE and don’t pretend to be an expert, but barrons shared some results from one survey and I saw that they measured the areas shown below. Certainly some of the questions are mush and the survey could be easily expanded or modified to gain more student perspectives on the quality of faculty. But there is some pretty good stuff in the Student Survey that, if I were trying to select a college, I would be particularly interested in. </p>

<p>Overall, the Student Survey below is far more specific, transparent and useful to me than the mysterious Peer Assessment system. Furthermore, I would strongly contend that at least some of the information that could be gleaned from the Student Survey is as useful as likely PA considerations of how many medals a professor has won or how many articles a professor has written and published in an academic journal.</p>

<p>OVERALL
(Students were asked to grade Excellent or Good or Fair or Poor)

  1. Student satisfaction with
    a. General academic quality of the school
    b. Overall positive educational experience
    c. Positive rating of the major program</p>

<p>(Students were asked to grade Definitely Yes or Probably Yes or Probably No or Definitely No)

  1. Would you attend the school again?</p>

<p>EDUCATIONAL & PERSONAL GROWTH
(Students were asked to grade Very Much or Quite a Bit or Some or Very Little)</p>

<p>How has your college experience contributed to:

  1. Thinking critically and analytically
  2. Acquiring a broad general education
  3. Learning effectively on your own
  4. Using computing technology
  5. Writing clearly and effectively
  6. Working effectively with others
  7. Analyzing qualitative problems
  8. Understanding yourself
  9. Acquiring work-related skills
  10. Speaking clearly and effectively
  11. Solving complex real world problems
  12. Developing a personal code of values
  13. Understanding other ethnic groups
  14. Contributing to my community
  15. Voting in local or national elections</p>

<p>CLASSROOM ACTIVITIES
(Students were asked to grade Very Often or Often or Sometimes or Never)</p>

<p>Have you:

  1. Worked on a project that integrated ideas
  2. Put together ideas from different courses
  3. Discussed ideas from classes with others outside of class
  4. Asked questions or contributed to discussions
  5. Worked with classmates outside of class on assignments
  6. Included diverse perspectives in discussions/assignments
  7. Prepared multiple drafts of a paper
  8. Made a class presentation
  9. Worked with other students during class
  10. Participated in community project as part of a course</p>

<p>COURSEWORK EMPHASIS
(Students were asked to grade Quite Often or Often or Sometimes or Very Little)</p>

<p>Did the coursework:

  1. Analyze the basic elements of an idea/experience/theory
  2. Apply theories or concepts
  3. Synthesize or organize ideas into complex interpretations
  4. Make judgments about the value of the information
  5. Memorize facts, ideas, or methods</p>

<p>INTERACTION WITH FACULTY MEMBERS
(Students were asked to grade Very Often or Often or Sometimes or Never)
How common was this activity in your interaction with faculty members?

  1. Sent email to instructors
  2. Received prompt feedback on academic performance
  3. Discussed grades or assignments with instructor
  4. Worked harder than you thought to meet expectations
  5. Talked about career plans with faculty member or advisor
  6. Discussed ideas from class with faculty outside of class
  7. Worked with faculty members on activities outside of coursework</p>

<p>(Students were asked to grade Done or Plan to Do or Do Not Plan to Do or Have Not Decided)

  1. Have you worked with faculty on research beyond course/program requirements?</p>

<p>(Students were asked to grade 7 or 6 or 5 or 4 or 1-3, on a scale where 1 = unhelpful and 7 = very helpful)

  1. Did you find faculty members available, helpful, sympathetic, etc.?</p>

<p>OUT OF CLASSROOM and ENRICHMENT ACTIVITIES
(Students were asked to grade Done or Plan to Do or Do Not Plan to Do or Have Not Decided)
Did you participate in:

  1. Community Service or volunteer work?
  2. Practicum, internship, field, or co-op experience, clinical experience?
  3. Foreign language coursework?
  4. Senior experience: capstone course, thesis, etc.?
  5. Work with faculty on research beyond course requirements?
  6. Independent study?
  7. Study abroad?</p>

<p>STUDENT TIME USE
(Students were asked to grade 20 hrs+ or 11-20 hrs or 6-10 hrs or 1-5 hrs)
How did you spend your time outside of class?

  1. Work for pay on campus
  2. Work for pay off campus
  3. Active in co-curricular activities
  4. Relaxing, socializing, exercising
  5. Prepare for classes</p>


<p>Well, the best way for the Aggies to defuse the negative impact of the late or missing report is to publish the entire report on the UC-Davis web site. While I doubt that many have a great interest in a “ranking” on fire safety, the opposite must be true for students who attend the university.</p>

<p>Now, if a school is late or does not participate in the survey AND also refrains from sharing the survey on a public forum, does it have the right to complain about being blackmailed?</p>

<p>For what it is worth, I still think that the ENTIRE peer assessment survey should be made public in a manner similar to the Common Data Set. Let’s see who punishes her friends and rewards her enemies with abandon and glee!</p>

<p>Nobody expects the schools to create information they do not have. However, schools that refuse to make available simple documents such as the CDS are not helping their cause, especially the schools that cynically publish the documents but hide them behind passwords --read a few schools in Mass.</p>

<p>Most larger schools have a department that does nothing but institutional assessment and publishes data similar to what has been provided in condensed form in the CDS used by US News and most other college guides for many years. If they are doing lots of separate research to fill out the CDS, they are not very well versed in the basic facts of their school.</p>


<p>Quite correct. </p>

<p><a href="http://www.airweb.org/">http://www.airweb.org/</a></p>

<p>In contrast, some smaller schools, particularly as you get into 2nd-tier LACs, have no institutional research office at all, and at most small institutions it’s often a single person, or at most two. They have to provide data for almost any issue the administration decides to study AND then provide free information to a host of commercial guidebooks and rankings. The link below gives links to the institutional research sites for the U.S. News top-50 LACs. There’s sometimes fascinating data available on the individual colleges’ IR sites. Well worth a click for any college on the “short list”:</p>

<p><a href="http://www.bucknell.edu/x5202.xml">http://www.bucknell.edu/x5202.xml</a></p>

<p>An article in the Washington Square News of NYU, “Colleges to rethink rankings,” reveals that eleven of the college presidents have signed the letter, and once 12 have signed it will be sent to numerous universities, including NYU. According to NYU spokesman John Beckman, NYU does not voluntarily participate in all of the ranking surveys; however, since the information used to create a school’s ranking is public, NYU is included.</p>


<p><a href="http://media.www.nyunews.com/media/storage/paper869/news/2007/04/26/News/Colleges.To.Rethink.Rankings-2882622-page2.shtml">http://media.www.nyunews.com/media/storage/paper869/news/2007/04/26/News/Colleges.To.Rethink.Rankings-2882622-page2.shtml</a></p>


<p>All you need to read is “according to the Princeton Review” to drop the comparison. The surveys of PR have as much validity and integrity as People magazine’s Sexiest Person on Earth. Of course, integrity and Princeton Review are not to be used in the same sentence too often.</p>

<p>Xiggi, I don’t want to stick up for PR.</p>

<p>NYU does get more applications than any private school in the country. I find it amusing that people come up with all these complicated formulas to figure out where students want to go, but these same people don’t want to use the number of applications.</p>

<p>NYU is pretty popular.</p>

<p>Call it the “Felicity” effect.</p>

<p>dstark,
Your point is well made and I agree that NYU is popular and received almost 35,000 applications last year (with 28.4% offered admission). It should also be noted that NYU’s enrollment is 20,566. The only other Top 50 USNWR private schools that have even 10,000 undergraduates are Cornell with 13,515 and USC with 16,897. Most are about one-third or less the size of NYU.</p>

<p>Barrons, what can I say? I listed Felicity as one of my top ten favorite tv shows in the favorite tv show thread in the parents cafe. :)</p>