I’ve already discussed the positive aspects of Above the Law (ATL)’s law school rankings. Here I address the poorly constructed parts of the ranking scheme. Once again, I use ATL to provoke further thought about all law school rankings.
Quality Jobs Score
ATL complements its overall employment score, which is one of the scheme’s positive features, with a “quality jobs score.” The latter counts only “placement with the country’s largest and best-paying law firms (using the National Law Journal’s ‘NLJ 250’) and the percentage of graduates embarking on federal judicial clerkships.”
I agree with ATL’s decision to give extra weight to some jobs; even among jobs requiring bar admission, some are more rewarding to graduates than others. This category, however, is unnecessarily narrow and too heavily slanted toward private practice.
ATL’s own justification for the category’s definition (counting careers that best support repayment of law school debt) points to an easy way to make it more useful. Government and public interest jobs, which qualify graduates for full loan forgiveness after ten years, also support repayment of law school debt. Given the short tenure of many BigLaw associates, the government/public interest route may be more reliable than the BigLaw one.
I would expand this category to include all government and public interest jobs that qualify graduates for loan forgiveness at the ten-year mark, excluding only those that are school-financed. Although ATL properly excludes JD-advantage jobs from its general employment score, I would include them here, as long as the jobs qualify for public-service loan forgiveness. A government job requiring bar admission, in other words, would count toward both employment measures, while a JD-advantage government position would count just once.
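To make that counting rule concrete, here is a minimal sketch in Python of how a single graduate’s job might be tallied under the expanded definition. The field names (bar_required, sector, pslf_eligible, and so on) are hypothetical placeholders for illustration, not ATL’s actual data schema.

```python
# Sketch of the expanded counting rule; field names are hypothetical.

def counts_toward_overall_employment(job):
    """ATL's general employment score counts only jobs requiring bar admission."""
    return job["bar_required"]

def counts_toward_quality_jobs(job):
    """Expanded quality jobs score: NLJ 250 firms, federal clerkships, and
    non-school-funded government/public interest jobs eligible for
    public-service loan forgiveness (including JD-advantage positions)."""
    if job["school_funded"]:
        return False
    if job["nlj_250"] or job["federal_clerkship"]:
        return True
    return job["sector"] in ("government", "public_interest") and job["pslf_eligible"]

# A bar-required city attorney position counts toward both measures;
# a JD-advantage policy analyst job in a federal agency counts only once,
# toward the quality jobs measure.
city_attorney = {"bar_required": True, "sector": "government", "nlj_250": False,
                 "federal_clerkship": False, "school_funded": False,
                 "pslf_eligible": True}
policy_analyst = dict(city_attorney, bar_required=False)

assert counts_toward_overall_employment(city_attorney)
assert counts_toward_quality_jobs(city_attorney)
assert not counts_toward_overall_employment(policy_analyst)
assert counts_toward_quality_jobs(policy_analyst)
```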
Making this change would reduce this factor’s bias toward private practice, while incorporating information that matters to a wider range of prospective students.
SCOTUS Clerks and Federal Judges
Many observers have criticized this component, which counts “a school’s graduates as a percentage of (1) all U.S. Supreme Court clerks (since 2010) and (2) currently sitting Article III judges.” For both of these, ATL adjusts the score for the size of the school. What’s up with that?
ATL defends the criterion as useful for students “who want to be [federal] judges and academics.” But that’s just silly. These jobs constitute such a small slice of the job market that they shouldn’t appear in a ranking designed to be useful for a large group of users. If ATL really embraces that broader goal, there’s an appropriate way to modify this factor.
First, get rid of the SCOTUS clerk count. That specialized information is available elsewhere (including on ATL) for prospective students who think it relevant to their choice of law school. Second, expand the count of sitting Article III judges to include (a) current members of Congress; (b) the President and Cabinet members; and (c) CEOs and General Counsels at all Fortune 500 companies. Finally, don’t adjust the counts for school size.
These changes would produce a measure of national influence in four key areas: the judiciary, executive branch, legislature, and corporate world. Only a small percentage of graduates will ever hold these very prestigious jobs, but the jobholders improve their school’s standing and influence. That’s why I wouldn’t adjust the counts for school size. If you’re measuring the power that a school exerts through alumni in these positions, the absolute number matters more than the percentage.
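For readers who like to see the arithmetic, here is a minimal sketch of the revised influence component for a single hypothetical school, using invented counts. The key design choice is that the four categories are summed as absolute numbers rather than divided by class or alumni-base size.

```python
# Hypothetical counts for one school, for illustration only; not actual data.
alumni_in_power = {
    "article_iii_judges": 12,
    "members_of_congress": 5,
    "president_and_cabinet": 1,
    "fortune_500_ceos_and_gcs": 9,
}

# National influence score: a raw sum across the four categories.
# Deliberately NOT divided by enrollment or alumni-base size, because the
# measure is the absolute power a school exerts through these positions.
national_influence = sum(alumni_in_power.values())
print(national_influence)  # 27
```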
Leaders in private law firms, state governments, and public interest organizations also enhance a school’s alumni network, and one could imagine adding those leaders to this component. Those organizations, however, already receive recognition in the two factors that measure immediate graduate employment. It seems more important to add legislative, executive, and corporate influence to the rankings. As a first step, therefore, I would try to modify this component as I’ve outlined here.
Component Sorting
A major flaw in ATL’s scheme is that it doesn’t allow users to sort schools by component scores. The editors have published the top five schools in most categories, but that falls far short of full sorting. Focused-purpose rankings are most useful when readers can sort schools on each component: one reader may value alumni ratings above all other factors, while another cares most about quality jobs. Adding a full-sort feature to the ranking would be an important step.
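Full sorting is not hard to implement once the component scores are published. The sketch below uses invented schools and scores (not ATL’s data) to show how readers could reorder the list by whichever component they value most.

```python
# Hypothetical component scores for illustration; not ATL's actual figures.
schools = [
    {"name": "School A", "quality_jobs": 41.2, "alumni_rating": 8.7, "employment": 88.5},
    {"name": "School B", "quality_jobs": 55.9, "alumni_rating": 7.9, "employment": 91.0},
    {"name": "School C", "quality_jobs": 33.4, "alumni_rating": 9.1, "employment": 84.2},
]

def sort_by_component(schools, component):
    """Return schools ordered from best to worst on a single component."""
    return sorted(schools, key=lambda s: s[component], reverse=True)

# One reader sorts on alumni ratings, another on quality jobs.
for school in sort_by_component(schools, "alumni_rating"):
    print(school["name"], school["alumni_rating"])
```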
Why Rank?
Like many educators, I dislike rankings. The negative incentives created by US News far outweigh the limited value it offers prospective students. Rankings can also mislead students into making decisions based solely on those schemes, rather than treating rank as one tool in a broader decision-making process. Even if the ATL rankings are modified in the ways I suggest here, both of these drawbacks may persist.
As Law School Transparency has shown, it is possible to give prospective students useful information about law schools without adding the baggage of rankings. Above the Law could perform a greater public service by publishing its data as an information set rather than as an integrated ranking.
But rankings draw attention and generate revenue; they are unlikely to disappear. If we’re going to have rankings, then it’s good to have more than one. Comparing schemes may help us see the flaws in all ranking systems; perhaps eventually we’ll reject rankings in favor of other ways to organize information.