On Friday, the ABA Section of Legal Education will consider a recommendation from the section’s data policy committee about when schools collect graduate employment data. Instead of collecting data nine months after graduation, schools would collect it ten months after graduation.
The change looks minor, but it’s misguided. The council should dismiss the recommendation outright for the reasons outlined below; at a minimum, it should decline to act on the recommendation this week.
The committee’s reasoning is straightforward: some graduates don’t obtain jobs by the nine-month mark because some state bars have a slow licensing process. As committee chair Len Strickman puts it in the committee’s recommendation memo, the data policy change would have “the benefit of a more level playing field.”
Several New York and California deans have lobbied for the policy change because those jurisdictions release July bar results so late. Last year, California released results on November 16th, with swearing-in ceremonies in the following weeks. New York released results earlier, on November 1st, but many graduates still waited months to be sworn in.
A variety of employers, such as small firms and state government agencies, tend to hire licensed graduates. Compared to schools in states with a quicker credentialing process, New York and California schools are disadvantaged on current employment metrics. Changing the measurement date to mid-March instead of mid-February would allegedly take some of the bite out of that disadvantage.
To check for a quantifiable advantage, the data policy committee considered two sets of data. First, the committee sorted schools by the percentage of 2012 graduates working in professional jobs (as lawyers or otherwise) as of February 15, 2013. Second, it sorted schools by the percentage of 2012 graduates who were unemployed or whose employment status was unknown. On both measures, the committee determined that New York and California schools were disproportionately represented on the bad end of the curve.
Professor Strickman notes in his committee memo that many of the poorly performing schools “are broadly considered to be highly competitive schools nationally.” I’m not sure exactly what this means, but it sounds a lot like confirmation bias. Is he suggesting that the employment outcomes don’t match U.S. News rankings? The committee’s collective impression of how well the schools should perform relative to one another? Faculty reputation? It’s a mystery, and without further support, not at all compelling.
Professor Strickman acknowledges that other factors may explain the relative placement, but he does not name or address them. Here are some factors that could explain the so-called disadvantage:
(1) Graduate surplus (not just 2012, but for years);
(2) Attractiveness of certain states to graduates from out-of-state schools;
(3) Overall health of local legal markets;
(4) Graduate desirability;
(5) Ability of schools to fund post-graduation jobs.
Nor do we know whether the rule revision would actually level the playing field. One extra month may not capture more professional job outcomes for graduates of New York and California schools than for graduates of other schools. More time, after all, ought to produce better results for all schools with high under- and unemployment.
In sum, the committee should have declined to recommend the ten-month proposal until its proponents had met their burden of persuasion. The problem has not been well articulated, and the data do not support the conclusion.
Worse than recommending an unsupported policy change, the committee ignores the group for whom law schools produce job statistics in the first place: prospective students. Prospective students, current students, and a society that depends on lawyers are the Section of Legal Education’s constituents. Calling the uneven playing field a “disadvantage,” a “penalty,” and a “hardship” for law schools shows where the committee’s perspective comes from.
(1) Is there a normative problem with an uneven playing field?
It’s not apparent that there’s an issue to resolve. Grant the committee its premise that state credentialing timelines affect performance on employment metrics. Is it the ABA’s job to ensure that schools compete with each other on a level playing field?
In one sense, yes, of course. When a school lies, cheats, or deceives, it gains an undeserved advantage, and ABA Standard 509 prohibits this behavior. But the standard does not prohibit that behavior because of how it affects school-on-school competition; the prohibition flows from the ABA’s role in protecting consumers and the public.
The ABA was ahead of the curve when it adopted Standard 509 in the 1990s. The organization interpreted its accreditation role to include communicating non-educational value to these constituents through employment information.
Here, the ABA failed to adequately consider the prospective students who want to make informed decisions, and the public, which subsidizes legal education.
Prospective students received only a passing mention in Professor Strickman’s memo. In describing why the committee rejected several schools’ request to push the measurement date to a full year after graduation, Professor Strickman explains:
The Data Policy and Collection Committee decided to reject this request because that length of delay would undermine the currency of data available to prospective law students.
As it happens, the committee’s chosen proposal also has a currency problem, yet the committee failed to convey whether or how it considered the change’s impact on the value of the consumer information.
(2) Does the new policy impede a prospective student’s ability to make informed decisions?
One of the ABA’s recent accomplishments was accelerating the publication of employment data. Previously, the ABA published new employment data 16 months after schools measured employment outcomes. In 2013, the ABA took only six weeks.
But if the Section of Legal Education adopts the ten-month proposal, it pushes data publication to the end of April—after many deposit deadlines and on the eve of others. While applicants should not overrate the importance of year-to-year differences, they should have the opportunity to evaluate the changes.
The new policy also makes the information less useful.
At one time, schools reported graduate employment outcomes as of six months after graduation. In 1996, NALP began measuring outcomes at nine months instead. The ABA, which at that time only asked schools to report their NALP employment rate, followed.
The six-month measurement makes far more sense than the nine-month date. Six months after graduation, the interest that accumulated during school capitalizes and the first loan payment comes due. Ideally, a graduate would use that six-month period to pay down the accrued interest so that less interest is paid later. The credentialing process makes this a rarity. Adding another month to the measurement makes the figure even less valuable.
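The stakes of that six-month window are easy to see with arithmetic. Below is a minimal sketch, using hypothetical figures that are not drawn from this post (a $100,000 principal, $10,000 of interest accrued during school, a 7% rate, and a ten-year repayment term), of how letting accrued interest capitalize raises the total amount repaid compared with paying it off during the grace period:

```python
# A rough illustration of interest capitalization at the end of the six-month
# grace period. All figures below are hypothetical, not drawn from the post.

principal = 100_000.0        # assumed amount borrowed for law school
accrued_interest = 10_000.0  # assumed interest accrued during school
annual_rate = 0.07           # assumed interest rate
term_months = 120            # standard ten-year repayment term

def monthly_payment(balance, rate, n_months):
    """Standard amortization payment for a balance at an annual rate over n_months."""
    r = rate / 12
    return balance * r / (1 - (1 + r) ** -n_months)

# Case 1: the graduate has no income, so the accrued interest capitalizes
# (is added to principal) when repayment begins.
total_if_capitalized = monthly_payment(principal + accrued_interest,
                                       annual_rate, term_months) * term_months

# Case 2: the graduate pays off the accrued interest during the grace period,
# so only the original principal amortizes.
total_if_paid_down = (monthly_payment(principal, annual_rate, term_months)
                      * term_months) + accrued_interest

print(f"Total repaid if accrued interest capitalizes: ${total_if_capitalized:,.0f}")
print(f"Total repaid if accrued interest is paid first: ${total_if_paid_down:,.0f}")
```

On these assumed numbers, capitalization adds a few thousand dollars over the life of the loan; a graduate avoids that cost only if the degree is already producing income during the grace period.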
Reducing comparability also dilutes the value of recent employment information: a mid-March measurement cannot be compared cleanly with prior years’ mid-February figures. Students should not consider one year of data in isolation; they should analyze changes and the reasons for those changes. It’s for this reason that, as of last August, the ABA requires schools to publish three years of employment data.
The council needs to add additional viewpoints to the data policy committee. Right now, the committee is dominated by law school insiders: all twelve members are current faculty, deans, or other administrators. The name change from the “Questionnaire Committee” to the “Data Policy and Collection Committee” envisions a policy role for the group.
Just as the council, the standards committee, and the accreditation committee need a diversity of viewpoints, so too does the data policy committee. Perhaps if that diversity had existed on the committee to begin with, the new measurement date would not have been recommended prematurely, or at all.
As the council considers whose interests it serves and whether the data policy recommendation is ripe for adoption, I hope its members also consider the drivers of the policy beyond a law school lobby promoting its own interests.
The policy presupposes a reality in which so many graduates cannot derive economic value from their law degrees nine months after graduating that the ABA must modify its collection policy in order to count them.
Let me repeat that. It takes so long to become a lawyer that nearly a year can pass, and it is reasonable to think that many people are still not using a credential they invested more than three years of time, money, and effort to receive. A career is (hopefully) decades long, but the brutal reality of credentialing is that its costs exceed what any fair system would contemplate. A change to the data policy as a solution would be funny were the economics of legal education not so tragic.