News
By Traci Hukill
In April 2005 the University of Denver's Sturm College of Law got a sour surprise. For no apparent reason, the school plummeted from 77th to 95th in U.S. News & World Report's closely watched annual ranking of law schools.
Anguished calls by Sturm officials to the magazine revealed the reason for the tumble: U.S. News & World Report had changed the way it factored in median LSAT scores. Even though the information accounted for just 12.5 percent of each school's overall score, the slight change in methodology was enough to send Sturm and a handful of other schools down in the rankings.
Also in the 2005 survey, 94th-ranked Santa Clara University School of Law slipped out of the top 100. "We were scratching our heads," says Santa Clara Senior Assistant Dean for Student Services Julia Yaffee. It turned out that confusion over how to report graduates' employment had led the school to submit a different set of data than usual.
This year the magazine reverted to its old formula for median LSAT scores, and Sturm recovered to place 70th, while Santa Clara, with all confusion banished, climbed back up to number 87. Still, such incidents fuel a growing sense of injustice.
One point critics make is that the rankings tend to perpetuate themselves over time: the best and the brightest are naturally drawn to the top-ranked schools, which in turn ensures those schools' continued dominance. Critics also point to the arbitrary weighting of certain factors, such as reputation, which accounts for a whopping 40 percent of the total score. U.S. News & World Report determines reputation by asking academics, judges, and lawyers to rate schools that they may or may not know much about. Meanwhile, the magazine fails to factor in quality of instruction.
Nine years ago University of Texas law professor Brian Leiter decided to do something about all the complaining he heard from colleagues each spring by coming up with his own ranking system. In addition to reputation surveys, Leiter's system adds an objective measure, law journal citations, to help determine faculty quality. Leiter also measures student quality and job-placement rates. One thing he doesn't do, though, is combine all the factors to come up with an aggregate rating. "At one point, a few years ago, I tried aggregating and decided it didn't make a damn bit of sense," he confides.
Last spring, as one measure of Leiter's growing influence, his website, www.leiterrankings.com, received roughly 5,000 hits a week, most of which he presumed came from prospective law students.