Can College Rankings Giant Keep Schools from Cheating?


The steady trickle of colleges and universities that have admitted to fudging the figures that they send to U.S. News & World Report could end up changing the college rankings submission rules.

In an interview published today in Inside Higher Ed, Brian Kelly, the editor and chief content officer at U.S. News, said that the college rankings giant is considering requiring an administrator at each institution to verify that the figures submitted to it are accurate.

Schools that have acknowledged sending false figures to U.S. News in the past year include Claremont McKenna, George Washington University, Emory University, Bucknell University and the graduate program at the Tulane School of Business. In my book, I mention other schools that have been embarrassed by their own rankings manipulation, including the U.S. Naval Academy, Clemson University and Baylor University.

Clearly these aren’t the only schools that have fudged their statistics. In fact, in a 2012 survey of college administrators by Inside Higher Ed, 91 percent of respondents said they believed that other institutions were cheating.

Here is an excerpt from the article:

[Image: excerpt from the Inside Higher Ed article on college rankings]

It is hard to believe that Kelly doesn’t think that college rankings hanky-panky is happening far more frequently than is reported. The staff at U.S. News (it’s no longer even a magazine) is tiny, and a source of mine told me that employees have seen plenty of questionable data that they have never investigated.

Ways to Manipulate the Numbers

Even if an administrator has to sign off on his or her school’s data, there are plenty of ways to game the system.

Baylor, for instance, was embarrassed when the student newspaper discovered that the university, which was fixated on boosting its rankings, asked an incoming freshman class to retake the SAT before the school year started. A freshman who retook the SAT got a $300 credit at the bookstore, and those who boosted their score by at least 50 points received a $1,000-a-year scholarship. An administrator at Clemson shared at a higher-ed conference that the university was manipulating its stats in a variety of ways, including rating other schools as below average on the peer assessment survey.

In today’s article, Jerome A. Lucido, the executive director of the Center for Enrollment Research, Policy and Practice at the University of Southern California, noted that there are plenty of ways to game the system.

[Image: quote from Jerome A. Lucido]

Bottom Line:

What families should keep in mind is that college rankings are deeply flawed, with or without cheating. The best way to use the rankings is simply to generate ideas.

Learn more about college rankings:

How College Rankings Can Hurt You

No. 1 Reason Why College Rankings are Lame

Blaming College Rankings for Runaway College Costs

Lynn O’Shaughnessy is the author of the second edition of The College Solution.

 

 


Comments

  1. There needs to be a “ranking” that reflects what a school actually does for a student who decides to enroll.

    The most relevant number in the U.S. News rankings is the “point spread” between the actual and predicted graduation rates for the national universities as well as the national liberal arts colleges. U.S. News bases the predicted graduation rate on the socioeconomic characteristics of the class that entered six years ago. When a school’s actual graduation rate dramatically exceeds the predicted rate, the school is doing more for its students than its incoming class would suggest (for example, Penn State is a +17: 70 percent were expected to graduate in six years, while 87 percent did; Rutgers-Newark is a +22: 46 percent predicted, 68 percent actual).

    The reason the point spread is relevant is that it measures the influence that the college’s services have upon a class of students. While most schools in a guide like U.S. News have a well-educated faculty, students also make use of other services (academic advising, residence life, the libraries, career services) that are no less important in helping them succeed. When you see a +17 or a +22, you can assume that students get considerable support.
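
    To make the arithmetic concrete, here is a minimal Python sketch of the point-spread calculation. The two schools and their rates come from the examples above; everything else is illustrative.

        # Point spread: actual six-year graduation rate minus the rate
        # predicted from the entering class's socioeconomic profile.
        # The rates below are the two examples cited in this comment.
        schools = {
            "Penn State": {"predicted": 70, "actual": 87},
            "Rutgers-Newark": {"predicted": 46, "actual": 68},
        }

        for name, r in schools.items():
            spread = r["actual"] - r["predicted"]
            print(f"{name}: {spread:+d} ({r['predicted']}% predicted, {r['actual']}% actual)")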

    Unfortunately, U.S. News only uses the point spread with national schools. That makes it more difficult for students to compare, for example, a very good regionally known public liberal arts school (The College of New Jersey, SUNY-Geneseo, Truman State) with a mid-state public university that might be more research-oriented.

    If I could develop a rating/ranking, it might cover not only the point spread, but also the following (a toy sketch of how these might be combined appears after the list):
    + Four-year and six-year graduation rates. Ideally, I’d want to know the graduation rates for student-athletes and students in economic opportunity programs, too; these are the major scholarship programs that have an advising component as well.
    + Freshman retention rate.
    + Percentage of students who participate in an internship, practicum, a research opportunity with a faculty member or an independent thesis. I would not lump study abroad into this unless it was tied to one of those opportunities. It’s important to know how much of the student body may take advantage of an advanced or more personalized academic or pre-professional opportunity.
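
    As a rough illustration, here is a toy Python sketch of how such a composite rating might be assembled. The metric names mirror the list above, but the weights, the rescaling, and the sample numbers are arbitrary assumptions for demonstration, not a proposal.

        # Illustrative only: a toy composite rating built from the metrics
        # listed above. The weights are arbitrary assumptions.
        WEIGHTS = {
            "point_spread": 0.30,   # actual minus predicted six-year grad rate
            "grad_rate_6yr": 0.25,  # six-year graduation rate (%)
            "retention": 0.25,      # freshman retention rate (%)
            "engaged_pct": 0.20,    # % in internships, research or a thesis
        }

        def composite_score(metrics):
            # Assume each metric is on a 0-100 scale except point_spread,
            # which is shifted so that a spread of 0 maps to 50.
            adjusted = dict(metrics)
            adjusted["point_spread"] = 50 + metrics["point_spread"]
            return sum(WEIGHTS[k] * adjusted[k] for k in WEIGHTS)

        # Hypothetical school using the Rutgers-Newark point spread cited
        # earlier; the other numbers are invented for illustration.
        example = {"point_spread": 22, "grad_rate_6yr": 68,
                   "retention": 85, "engaged_pct": 40}
        print(f"composite score: {composite_score(example):.1f}")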

    1. Stuart, I love where you are going with this! I would also want to be able to quantify the quality of the learning opportunities in some sort of way that I could adequately judge the difference between a “national” school and a “regional” school. Any ideas on how that could be done?

      1. There’s really little to no difference between a “regional” and a “national” school other than the focus on doctoral education and research vs. teaching.

        Here’s one example of two schools with roughly the same number of undergraduates that are cross-shopped by applicants: James Madison University (VA) and the University of Delaware.

        + James Madison grants doctoral degrees in 10 subjects
        + Delaware grants doctoral degrees in more than 40

        + 12 percent of the classes at James Madison have more than 50 students
        + 15 percent of the classes at Delaware have more than 50 students, and there are more majors at this school.
        + Their student-faculty ratios are the same: 16 to 1

        James Madison is less likely to rely on teaching assistants to teach its larger classes; it doesn’t have as many doctoral candidates to help out. However, professors at James Madison have less time to do research (and work with their best students) than professors at Delaware because they have to do more teaching.

        It’s a toss-up.

        Now I’ll add a smaller school, the University of Mary Washington (VA), which is also cross-shopped against James Madison. Mary Washington has around 4,000 undergraduates, grants no doctoral degrees and did not have an honors program until this year. But it has lots of learning opportunities, and more than two-thirds of the freshmen finish on time. Mary Washington has a student-faculty ratio of 15 to 1, and around four percent of its classes have 50 or more students.

        At first impression, Mary Washington would provide a more “personal” experience than Delaware or James Madison. The student is more likely to have smaller classes and attention from full-time faculty. But student research opportunities in STEM (science, technology, engineering and math) are harder to come by. The smaller school is less likely to have the laboratories and equipment available at the larger schools, and it has fewer grants to bring on students as research assistants.

        So it’s very hard to assess the “quality” of the opportunities students may have available to them. If I were a social sciences major (history, political science), I’d say Mary Washington would be the best of the three schools in terms of access to opportunities; a student would get more attention, and the facilities are fine to support these majors. If I were in a pre-professional or STEM major, one of the other two schools might be better.

  2. Great information, Lynn. I love your last sentence, “The best way to use the rankings is simply to generate ideas.” That’s what I figured out after spending hours and hours poring over different rankings looking for candidates in my daughter’s college search and wondering what it all meant. It was driving me nuts. The US News rankings left me wondering things like 1) Is a school on the national ranking list automatically better than a school on a regional list? 2) If my daughter goes to a school in the top 100 national universities or liberal arts colleges, will she automatically be better equipped to succeed? 3) Why is the same school ranked much higher by US News than on other lists?

    Parents and students need to stop stressing out about rankings and look at schools on a more personal level – what can this school do for me? A student doesn’t need a school high on the ranking lists to get a great education and get on the path to a great career. I know this is idealistic, but the more we can get people to realize that, the less incentive schools will have to cheat on their ranking data.

    1. Hi Wendy,

      You pose excellent questions, and I’m glad that you concluded that the best way to use the rankings is simply as a way to generate ideas. I think that is especially important if you are looking at schools on the liberal arts and regional lists. These schools aren’t as well known as the institutions that tend to be in the national university category.

      Lynn O’Shaughnessy