From time to time, I like to read real books instead of electronic ones. During a recent ramble through my law school’s library, I stumbled across an intriguing set of volumes: NALP employment reports from the late nineteen seventies. These books are so old that they still have those funny cards in the back. It was the content, though, that really took my breath away. During the 1970s, NALP manipulated data about law school career outcomes in a way that makes more contemporary methods look tame. Before I get to that, let me give you the background.
NALP compiled its first employment report for the Class of 1974. The data collection was fairly rudimentary. The association asked all ABA-accredited schools to submit basic data about their graduates, including the total number of class members, the number employed, and the number known to be still seeking work. This generated some pretty patchy statistics. Only 83 schools (out of about 156) participated in the original survey. Those schools graduated 17,188 JDs, but they reported employment data for just 13,250. More than a fifth of the graduates (22.9%) from this self-selected group of schools failed to share their employment status with the schools.
NALP’s early publications made no attempt to analyze this selection bias; the reports I’ve examined (for the Classes of 1977 and 1978) don’t even mention the possibility that graduates who neglect to report their employment status might differ from those who provide that information. The reports address the representativeness of participating schools, but in a comical manner. The reports divide the schools by institutional type (e.g., public or private) and geographic region, then present a cross-tabulation showing the number and percentage of schools participating in each category. For the Class of 1977, participation rates varied from 62.5% to 100%, but the report gleefully declares: “You will note the consistently high percentage of each type of institution, as well as the large number of schools sampled. I believe we can safely say that our study is, in fact, representative!” (p. 7)
Anyone with an elementary grasp of statistics knows that’s nonsense. The question isn’t whether the percentages were “high,” it’s how they varied across categories. Ironically, at the very time that NALP published the quoted language, I was taking a first-year elective on “Law and Social Science” at my law school. It’s galling that law schools weren’t practicing the quantitative basics that they were already teaching.
NALP quickly secured more participating schools, which mooted this particular example of bad statistics. By 1978, NALP was obtaining responses from 150 of the 167 ABA-approved law schools. Higher levels of school participation, however, did not solve the problem of missing graduates. For the Classes of 1974 through 1978, NALP was missing data on 19.4% to 23.7% of the graduates from reporting schools. Blithely ignoring those graduates, NALP calculated the employment rate each year simply by dividing the number of graduates who held any type of job by the number whose employment status was known. This misleading method, which NALP still uses today, yielded an impressive employment rate of 88.1% for the Class of 1974.
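To make the denominator effect concrete, here is a quick sketch in Python using the Class of 1974 figures quoted above; the employed count is back-calculated from the reported 88.1% rate, so treat it as approximate.

```python
# Class of 1974 figures from the NALP report discussed above.
graduates = 17_188       # total JDs at the 83 reporting schools
known_status = 13_250    # graduates whose employment status was known
employed = round(known_status * 0.881)   # ~11,673, back-calculated from 88.1%

nalp_rate = employed / known_status      # NALP's method: ignore the unknowns
all_grad_rate = employed / graduates     # conservative: count every graduate

print(f"NALP rate:         {nalp_rate:.1%}")      # 88.1%
print(f"All-graduate rate: {all_grad_rate:.1%}")  # 67.9%
```

Simply by dropping graduates with unknown status from the denominator, an employment rate in the high sixties becomes one in the high eighties.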
But even that wasn’t enough. Starting with the Class of 1975, NALP devised a truly ingenious way to raise employment rates: It excluded from its calculation any graduate who had secured neither a job nor bar admission by the spring following graduation. As NALP explained in the introduction to its 1977 report: “The employment market for new attorneys does not consist of all those that have graduated from ABA-approved law schools. In order for a person to practice law, there is a basic requirement of taking and passing a state bar examination. Those who do not take or do not pass the bar examination should therefore be excluded from the employment market….” (p. 1)
That would make sense if NALP had been measuring the percentage of bar-qualified graduates who obtained jobs. But here’s the kicker: At the same time that NALP excluded unemployed bar no-admits from its calculation, it continued to include employed ones. Many graduates in the latter category held jobs of the kind we now call “JD Advantage” positions. NALP’s 1975 decision gave law schools credit for all graduates who found jobs that didn’t require a law license, while allowing them to disown (for reporting purposes) graduates who didn’t obtain a license and remained jobless.
I can’t think of a justification for that–other than raising the overall employment rate. Measure employment among all graduates, or measure it among all grads who have been admitted to the bar. You can’t use one criterion for employed graduates and a different one for unemployed graduates. Yet the “NALP Research Committee, upon consultation with executive committee members and many placement directors from throughout the country” endorsed this double standard. (id.)
And the trick worked. By counting graduates who didn’t pass the bar but nonetheless secured employment, while excluding those who didn’t take the bar and failed to get jobs, NALP produced a steady rise in JD employment rates: 88.1% in 1974 (under the original method), 91.6% in 1975, 92.5% in 1976, 93.6% in 1977, and a remarkable 94.2% in 1978. That 94.2% statistic ignored 19.5% of graduates who didn’t report any employment status, plus another 3.7% who hadn’t been admitted to the bar and were known to be unemployed. But, whatever.
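A small sketch shows how much work the 1975 exclusion rule does. The per-100-graduate figures below are loosely modeled on the Class of 1978 numbers just quoted (19.5% unknown status, 3.7% unemployed non-admits), with the employed count back-calculated from the 94.2% rate.

```python
# Illustrative counts per 100 graduates, loosely modeled on the Class of
# 1978 figures quoted above; the employed count is back-calculated.
unknown     = 19.5                 # never reported an employment status
unemp_noadm = 3.7                  # no bar admission and no job
known       = 100 - unknown        # 80.5 graduates with a known status
employed    = 0.942 * (known - unemp_noadm)   # ~72.3, from the 94.2% rate

pre_1975  = employed / known                   # unemployed non-admits counted
post_1975 = employed / (known - unemp_noadm)   # ...and now they vanish

print(f"Pre-1975 method:  {pre_1975:.1%}")    # ~89.9%
print(f"Post-1975 method: {post_1975:.1%}")   # 94.2%
```

On these assumptions, the exclusion rule alone adds more than four percentage points to the headline rate, without a single additional graduate finding work.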
NALP was very pleased with its innovation. The report for the Class of 1977 states: “This revised and more realistic picture of the employment market for newly graduated and qualified lawyers reveals that instead of facing unemployment, the prospects for employment within the first year of graduation are in fact better than before. Study of the profile also reveals that there has been an incremental increase in the number of graduates employed and a corresponding drop in unemployment during that same period.” (p. 21) Yup, unemployment rates will fall if you ignore those pesky graduates who neither found jobs nor got admitted to the bar–while continuing to count all of the JD Advantage jobs.
I don’t know when NALP abandoned this piece of data chicanery. My library didn’t order any of the NALP reports between 1979 and 1995, so I can’t trace the evolution of NALP’s reporting method. By 1996, NALP was no longer counting unlicensed grads with jobs while ignoring those without jobs. Someone helped them come to their senses.
Why bring this up now? In part, I’m startled by the sheer audacity of this data manipulation. Equally important, I think it’s essential for law schools to recognize our long history of distorting data about employment outcomes. During the early years of these reports, NALP didn’t even have a technical staff: these reports were written and vetted by placement directors from law schools. It’s a sorry history.
At Washington & Lee, as at most schools right now, we would prefer that our students were more successful in obtaining employment. But the 2012 employment figures, unfortunate as they are, say nothing about our curricular reform. It is simply too early . . . much too early.
The 2012 numbers refer to the first full class to pass through the reformed third-year curriculum. Ours is a slow-to-change profession. Employers as a group do not change their settled practices on a dime. Nothing in the employment numbers that we see for the next three to five years should be read as reflecting the reception given to the curricular reform. No curricular reform I know of, including Langdell’s, changed the settled practices of others overnight.
NALP, the National Association for Law Placement, has released selected findings about employment for the Class of 2012. The findings and accompanying press release don’t tell us much more than the ABA data published in late March, but there are a few interesting nuggets. Here are my top ten take-aways from the NALP data.
1. Law school leads to unemployment. I’m sorry to put that so bluntly, but it’s true. Even after adopting a very generous definition of employment–one that includes any work for pay, whatever the nature of the work, the number of hours worked per week, and the permanence (or lack thereof) of the position–only 84.7% of graduates from ABA-accredited schools were employed nine months after graduation. Almost one in six graduates had no job at all nine months out. That statistic is beyond embarrassing.
Some of those graduates were enrolled in other degree programs, and some reported that they were not seeking work. Neither of those categories, however, should offer much comfort to law schools or prospective students. It’s true that yet another degree (say an LLM in tax or an MBA) may lead to employment, but those degrees add still more time and money to a student’s JD investment. Graduates who are unemployed and not seeking work, meanwhile, often are studying for the February bar exam–sometimes after failing on their first attempt. Again, this is not a comforting prospect for students considering law school.
Even if we exclude both of those categories, moreover, 10.7% of 2012 graduates–more than one in every ten–were completely unemployed and actively seeking work in February 2013. The national unemployment rate that month was just 7.7%. Even among 25-to-29-year-olds, a group that faces higher than average unemployment, the most recent reported unemployment rate (for 2012) was 8.9%. Recent graduates of ABA-accredited law schools are more likely to be unemployed than other workers their age–most of whom have far less education.
2. Nine months is a long time. When responding to these dismal nine-month statistics, law schools encourage graduates to consider the long term. Humans, however, have this annoying need to eat, stay warm, and obtain health care in the present. Most of us would be pretty unhappy if we were laid off and it took more than nine months to find another job. How would we buy food, pay our rent, and purchase prescriptions during those months? For new graduates it’s even worse. They don’t have the savings that more senior workers may have as a cushion for unemployment; nor can they draw unemployment compensation. On the contrary, they need to start repaying their hefty law school loans six months after graduation.
When we read nine-month statistics, we should bear those facts in mind. Sure, the unemployed graduates may eventually find work. But most of them already withdrew from the workforce for three years of law school; borrowed heavily to fund those years; borrowed still more to support three months of bar study; sustained themselves (somehow) for another six months; and have been hearing from their loan repayment companies for three months. If ten percent are still unemployed and seeking work the February after graduation, what are they living on?
3. If you want to practice law, the outlook is even worse. Buried in the NALP releases, you’ll discover that only 58.3% of graduates secured a full-time job that required bar admission and would last at least a year. Even that estimate is a little high, because NALP excludes from its calculation over 1,000 graduates whose employment status was unknown. Three years of law school, three months of bar study, six months of job hunting, and more than two out of every five law graduates still have not found steady, full-time legal work. If you think those two out of five wanted JD Advantage jobs instead, read on.
4. Many of the jobs are stopgap employment. Almost a quarter of 2012 graduates with jobs in February 2013 were actively looking for other work. The percentage of dissatisfied workers was particularly high among those with JD Advantage positions: forty-three percent of them were seeking another job. JD Advantage positions offer attractive career options for some graduates, but for many they are simply a way to pay the bills while continuing the hunt for a legal job.
5. NALP won’t tell you what you want to know. When the ABA reported similar employment statistics in March, it led with the information that most readers want to know: “Law schools reported that 56.2 percent of graduates of the class of 2012 were employed in long-term, full-time positions where bar passage was required.” The ABA’s press release followed up with the percentage of graduates in long-term, full-time JD Advantage positions (9.5%) and offered comparisons to 2011 for both figures. Bottom line: Nine months after graduation, about two-thirds of 2012 graduates had full-time, steady employment related to their JD.
You won’t find that key information in either of the two reports that NALP released today. You can dig out the first of those statistics (the percentage of the class holding full-time, long-term jobs that required bar admission), but it’s buried at the bottom of the second page of the Selected Findings. You won’t find the second statistic (the percentage of full-time, long-term JD Advantage jobs) anywhere; NALP reports only a more general percentage (including temporary and part-time jobs) for that category.
NALP’s Executive Director, James Leipold, laments disclosing even that much. He tells us that the percentage of full-time, long-term jobs requiring bar passage “is certainly not a fair measure of the value of a legal education or the return on investment, or even a fair measure of the success of a particular graduating class in the marketplace.” Apparently forgetting the ABA’s attention to this employment measure, Leipold dismisses it as “the focus of so much of the media scrutiny of legal education.”
What number does NALP feature instead? That overall employment rate of 84.7%, which includes non-professional jobs, part-time jobs, and temporary jobs. Apparently those jobs are a more “fair measure of the value of a legal education.”
6. Law students are subsidizing government and nonprofits. NALP observes that the percentage of government and public interest jobs “has remained relatively stable for more than 30 years, at 26-29%.” At the same time, it reports that most law-school-funded jobs lie in this sector. If the percentage of jobs has remained stable, and law schools are now funding some of those spots, then law schools are subsidizing the government and public interest work. “Law schools,” of course, means students who pay tuition to those schools. Even if schools support post-graduate fellowships with donor money, those contributions could have been used to defray tuition costs.
I’m all in favor of public service, but shouldn’t the taxpayers and charitable donors pay for that work? In the current scheme, law students are borrowing significant sums from the government, at high interest rates, so that they can pay tuition that is used to subsidize government and nonprofit employees. Call me old fashioned, but that seems like a complicated (and regressive) way to pay for needed services. Why not raise taxes on people like me, who actually earn money, rather than issue more loans to people who hope someday to earn money?
7. Don’t pay much attention to NALP’s salary figures. NALP reports some salary information, which the ABA eschews. Those tantalizing figures draw some readers to the NALP report–and hype the full $95 version it will release in August. But the salary numbers are more misleading than useful. NALP reports salary information only for graduates who hold full-time, long-term positions and who report their salaries. That’s a minority of law graduates: Last year NALP reported salaries for just 18,639 graduates, from a total class of 44,495. Reported salaries, therefore, represented just 41.9% of the class. The percentage this year is comparable.
That group, furthermore, disproportionately represents the highest salaries. As NALP itself recognizes, salaries are “disproportionately reported for those graduates working at large firms,” so median salaries are “biased upward.” Swallow any salary reports, in other words, with a tablespoon of salt.
8. After accounting for inflation, today’s reported salaries are lower than ones from the last century. Although NALP’s reported salaries skew high, they offer some guidance to salary trends over time. Unfortunately, those trends are negative. During the early nineteen nineties, the country was in recession and law firms hadn’t yet accelerated pay for new associates. The median reported salary for 1991 graduates was just $40,000. Accounting for inflation, that’s equivalent to a 2012 median salary of $67,428. The actual reported median for that class, however, was just $61,245. Even when today’s graduates land a full-time, steady job, they’re earning 9.2% less than graduates from the last century.
9. The lights of BigLaw continue to dim. NALP acknowledges the “‘new normal’ in which large firm hiring has recovered some but remains far below pre-recession highs.” The largest firms, those with more than 500 lawyers, hired more than 3,600 members of the Class of 2012, a total that modestly exceeded the number hired from the Class of 2011. Current employment, however, remains well shy of the 5,100 graduates hired from the Class of 2009. Meanwhile, a growing percentage of those BigLaw hires are staff attorneys rather than associates. These lower-status, lower-paid lawyers currently comprise 4% of new BigLaw hires, and they are “more common than just two years ago.”
Inflation, meanwhile, has eroded salaries for even the best paid associates in BigLaw. In 2000, NALP reported a median salary of $125,000 for graduates joining firms that employed more than 500 lawyers. Adjusting for inflation, that would be $166,662 for the Class of 2012. BigLaw associates won’t starve on the median $160,000 they’re actually earning, but they’re taking home less in real dollars than the associates who started at the turn of the century.
For associates joining the second tier of BigLaw, firms that employ 251-500 lawyers, the salary news is even worse. In 2000, those associates also reported a median salary of $125,000, which would translate to $166,662 today. The actual median, however, appears to be $145,000 (the same figure reported for 2011). That’s a decline of 13% in real dollars.
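For readers who want to check these conversions, here is a minimal sketch using approximate CPI-U annual averages; the exact figures depend on which inflation index and reference month you use, so the outputs differ slightly from the numbers above.

```python
# Approximate CPI-U annual averages (Bureau of Labor Statistics).
CPI = {1991: 136.2, 2000: 172.2, 2012: 229.6}

def to_2012_dollars(amount: float, year: int) -> float:
    """Convert a salary to 2012 dollars using the CPI ratio."""
    return amount * CPI[2012] / CPI[year]

print(f"$40,000 in 1991  = ${to_2012_dollars(40_000, 1991):,.0f} in 2012")   # ~$67,430
print(f"$125,000 in 2000 = ${to_2012_dollars(125_000, 2000):,.0f} in 2012")  # ~$166,667

# Real decline for 251-500 lawyer firms: actual $145,000 median vs. adjusted.
decline = 1 - 145_000 / to_2012_dollars(125_000, 2000)
print(f"Real decline: {decline:.0%}")   # ~13%
```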
10. It goes almost without saying that these 2012 graduates paid much more for their law school education than students did in 1991, 2000, or almost any other year. Law school tuition has far outpaced inflation over the last three decades. It’s scant comfort to this class–or to the classes of 2010, 2011, or 2013–that heavy discounts are starting to ease tuition. These are classes that bought very high and are selling very low. There’s little that law schools can do to make up the difference to these graduates, but we shouldn’t forget the financial hardship they face. If nothing else, the tuition-jobs gap for these classes should make us commit to the boldest possible reforms of legal education.
Pedagogically and professionally, it makes sense for law schools to teach practical skills along with theory and doctrine. New lawyers should know how to interview clients, file simple legal documents, and analyze real-world problems, just as new doctors should know how to interview patients, use a stethoscope, and offer a diagnosis. Hands-on work can also deepen knowledge received in the classroom. Law students who apply classroom theories to real or simulated clients develop stronger intellectual skills, as well as new practical ones.
Employers say they are eager to hire these better-trained, more rounded, more “practice ready” lawyers–and they should be. That’s why the employment results for Washington & Lee’s School of Law are so troubling. Washington & Lee pioneered an experiential third-year program that has won accolades from many observers. Bill Henderson called Washington & Lee’s program the “biggest legal education story of 2013.” The National Jurist named the school’s faculty as among the twenty-five most influential people in legal education. Surely graduates of this widely praised program are reaping success in the job market?
Sadly, the statistics say otherwise. Washington & Lee’s recent employment outcomes are worse than those of similarly ranked schools. The results are troubling for advocates of experiential learning. They should also force employers to reflect on their own behavior: Does the rhetoric of “practice ready” graduates align with the reality of legal hiring? Let’s look at what’s happening with Washington & Lee graduates.
Employment Outcomes
I used the law-job calculator developed by Educating Tomorrow’s Lawyers to compare Washington & Lee’s employment outcomes with those of other schools. Drawing upon ABA data that reports job outcomes nine months after graduation, the calculator allows users to choose their own formulas for measuring outcomes. I chose two formulas that I believe resonate with many observers:
(a) The number of full-time, long-term jobs requiring bar admission, minus (i) any of those jobs funded by the law school and (ii) any solo positions; all divided by the total number of graduates.
(b) The number of full-time, long-term jobs requiring bar admission or for which the JD provided an advantage, minus (i) any of those jobs funded by the law school and (ii) any solo positions; all divided by the total number of graduates.
[Note: These are not the only formulas for measuring job outcomes; other formulas may be appropriate in other contexts. These formulas work here because they allow the most straightforward comparison of employment outcomes across schools. These formulas also make the best case for Washington & Lee’s outcomes, because that school did not report any long-term, full-time solos or school-funded jobs in 2011 or 2012.]
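For concreteness, here is a sketch of how those two formulas reduce to simple arithmetic on a school’s reported counts. The field names are my own shorthand, not the ABA’s, and the example school is hypothetical.

```python
def employment_scores(school: dict) -> tuple[float, float]:
    """Formulas (a) and (b); all job counts are full-time, long-term."""
    bar = (school["bar_required"]
           - school["bar_required_school_funded"]
           - school["bar_required_solo"])
    jda = (school["jd_advantage"]
           - school["jd_advantage_school_funded"]
           - school["jd_advantage_solo"])
    grads = school["total_graduates"]
    return bar / grads, (bar + jda) / grads

# A hypothetical school, for illustration only.
example = {
    "total_graduates": 400,
    "bar_required": 230, "bar_required_school_funded": 5, "bar_required_solo": 5,
    "jd_advantage": 40, "jd_advantage_school_funded": 2, "jd_advantage_solo": 0,
}
a, b = employment_scores(example)
print(f"Formula (a): {a:.1%}   Formula (b): {b:.1%}")  # 55.0%   64.5%
```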
Using those two measures, Washington & Lee’s employment outcomes for 2011 were noticeably mediocre. By nine months after graduation, only 55.0% of the school’s graduates had obtained full-time, long-term jobs that required bar admission. That percentage placed Washington & Lee 76th among ABA-accredited schools for job outcomes. Using the second, broader metric, 64.3% of Washington & Lee’s class secured full-time, long-term positions. But that only nudged the school up a few spots compared to other schools–to 73rd place.
In 2012, the numbers were even worse. Only 49.2% of Washington & Lee’s 2012 graduates obtained full-time, long-term jobs that required a law license, ranking the school 119th compared to other accredited schools. Including JD Advantage jobs raised the percentage to 57.7%, but lowered Washington & Lee’s comparative rank to 127th.
These numbers are depressing by any measure; they are startling when we remember that Washington & Lee currently is tied for twenty-sixth place in the US News ranking. Other schools of similar rank fare much better on employment outcomes.
The University of Iowa, for example, holds the same US News rank as Washington & Lee and suffers from a similarly rural location. Yet Iowa placed 70.8% of its 2012 graduates in full-time, long-term jobs requiring bar admission–more than twenty percentage points better than Washington & Lee. The College of William & Mary ranks a bit below Washington & Lee in US News (at 33rd) and operates in the same state. After excluding solos and school-funded positions (as my formula requires), William & Mary placed 55.9% of its 2012 graduates in full-time, long-term jobs requiring bar admission–significantly better than Washington & Lee’s results.
What’s the Explanation?
Law school employment outcomes vary substantially. Geography, school size, and local competition all seem to play a role. But Washington & Lee’s outcomes are puzzling given both the prominence of its third-year program and the stridency of practitioner calls for more practical training. Just last week, California’s Task Force on Admissions Regulation Reform suggested: “If, in the future, new lawyers come into the profession more practice-ready than they are today, more jobs will be available and new lawyers will be better equipped to compete for those jobs.” (p. 14) If that’s true, why isn’t the formula working for Washington & Lee?
I think we need to explore at least four possibilities. First and most important, the connection between practical training and jobs may be much smaller than practitioners and bar associations assert. Employers like practice-ready graduates because those new lawyers are cheaper to train; an employer thus might be more likely to hire a practice-ready graduate than a clueless one. Most of those hiring decisions, however, involve choosing among applicants, not creating new positions. A few employers might hire a practice-ready graduate when they wouldn’t have otherwise hired any lawyer, but those job-market gains are likely to be small.
Practice-readiness can even reduce the number of available jobs. If a practice-ready lawyer handles more work than a less-experienced one, her employer may need fewer entry-level lawyers. Even the best-trained new lawyer is unlikely to grow the client base immediately. The number of legal jobs depends much more on client demand and employer entrepreneurship than on the experience that new graduates possess. Maybe the employers recruiting at Washington & Lee have recognized that truth.
Second, even when allocating existing jobs, employers may care less about practical training than they claim. Law school clinicians have noted for years that legal employers rarely demand “clinical experience” as a prerequisite for on-campus interviews. Instead, their campus interviewing forms are more likely to list “top ten percent” or “law review.” Old habits die hard. Employers have maintained for the last few years that “this time we really mean it when we ask for practical skills,” but maybe they don’t.
Third, employers may care about experience, but want to see that experience in the area for which they’re hiring. This possibility is particularly troubling for law schools that are trying to expand clinical and other client-centered offerings. As a professor who teaches both a criminal defense clinic and a prosecution one, I can see the ways in which these experiences apply to other practice areas. A student who learns to discern the client’s individual needs, as our defense lawyers do, can transport that lesson to any practice area. A student who weighs competing interests in deciding whether to prosecute can apply similar skills for any employer.
Unfortunately, however, I don’t think employers always share my impression. Over the years, I’ve had the sense that students from the criminal defense clinic are stereotyped as public defenders, do-gooders, or (worse) anti-establishment radicals–even if they took the clinic for the client counseling, negotiation, and representation experience. Prosecution students don’t encounter the same negative images, but they sometimes have trouble persuading law firms and corporations that they’re serious about practicing corporate law.
No matter how many clinics and simulations a law school offers–and Washington & Lee provides a lot–each student can only schedule a few of these experiences. If a student chooses experiential work in entertainment law and intellectual property, does the student diminish her prospects of finding work in banking or family law? Does working in the Black Lung Legal Clinic create a black mark against a student applying to work later for corporate clients?
I wonder, in other words, if the menu of clinical choices we offer students actually operates against them. Would it be better to cycle all students through a series of required clinical experiences? That’s the way that medical school rotations work. Under that system, would employers better understand that all clinical experience has value for a new lawyer? Would they be less likely to lock graduates into particular career paths based on the clinical experiences they chose? These are questions we need to pursue as we expand experiential education in law schools.
A fourth possible explanation for Washington & Lee’s disappointing employment outcomes is that the students themselves may have developed higher or more specialized career ambitions than their peers at other schools. Some students may have been so excited by their clinical work that they were unwilling to accept jobs in other areas. Others, buoyed by employers’ enthusiasm for practice-ready graduates, may have held out for the most attractive positions on the market. If this explanation has power, then Washington & Lee’s graduates may fare better as more months pass. Maybe practice-ready graduates get better jobs, and perform better for their employers, but the matches take longer to make.
What Do We Learn?
What lessons should we take from Washington & Lee’s 2011 and 2012 employment outcomes? First, the school still deserves substantial credit for its willingness to innovate–as well as for the particular program it chose. If law school remains a three-year, graduate program, then experiential work should occupy a larger segment of the curriculum than it has at most schools in the past. That makes pedagogic sense and, even if experiential learning doesn’t expand the job market, it should produce more thoughtful, well rounded attorneys.
Second, legal employers should take a hard look at the factors they actually value in hiring. What role does clinical experience really play? Do grades and law review membership still count more? Are employers discounting clinical work done outside their practice area? Are they even holding that work against a candidate? Law schools are engaging in significant introspection about the education they provide; it is time for employers to critically examine their own actions and hiring assumptions.
Third, law schools and employers should work together to design the best type of experiential education–one that prepares graduates for immediate employment as well as long-term success. If employers value a 4-credit externship with their own organization more than 12 credits of clinical work in a different area, we need to grapple with that fact. Schools might decide not to accommodate that desire; we might worry that externships are too narrow (or too exploitative of students) and encourage employers to value other clinical training more highly. On the other hand, we might agree that the best experiential education relates directly to a student’s post-graduate job. Unless we work together, we won’t figure out either the hurdles or the solutions.
Washington & Lee’s employment outcomes are a puzzle that we all need to confront. Graduates from most law schools, even high-ranking ones, are struggling to find good jobs. Experiential education can work pedagogic magic and prepare better lawyers, but it’s not a silver bullet for employment woes or heavy debt. On those two issues, we need to push much harder for remedies.
John Colombo has posted a useful paper examining the best organizational form for postgraduate law firms created by law schools. Several law schools are exploring that type of firm; we discussed the general idea in several earlier posts. Professor Colombo probes the important tax consequences of organizing these entities, an issue that no school would want to ignore.
Colombo’s analysis suggests, first, that a firm operating as a division of a law school would not endanger the school’s tax-exempt status. Even if the firm charged clients for representation, paid graduates employed by the firm, and generated net revenue, the firm would not negate the school’s tax-exempt status as long as its activities remained “functionally related to the educational mission of the underlying school.” Colombo offers more detail on meeting that and related IRS tests, but concludes that postgraduate firms should readily pass muster.
Some schools, however, might prefer to establish a law firm as a separate non-profit entity. In particular, schools (and their governing universities) might prefer to isolate the school from liabilities incurred by the firm. Professor Colombo’s analysis, however, shows that it would be difficult for a separate non-profit to qualify for tax-exempt status. The precedents conflict, but “the bulk of these precedents indicate that organizations conducting commercial-like businesses as their primary activity will face considerable hostility from the IRS in seeking exempt status.” Even if a school-related law firm ultimately won the day, few law schools would want a new project like this to face IRS opposition.
Fortunately, there is a solution for schools located in states that allow law practices to function as limited liability companies (LLC’s). If the law school creates an LLC to house the firm, with the school as the LLC’s only member, then “the law school will receive the state-law liability protection of a limited-liability entity, while having the tax exemption issues analyzed as though the firm were operated as a ‘division’ of the law school.” The firm, in other words, would receive the school’s tax-exempt status.
I can’t pretend to evaluate Professor Colombo’s assessment; I’ve figured out relevant parts of the personal income tax, but don’t have a clue about the taxation of businesses or other organizations. Colombo, however, is a pro in this area, and his analysis is cogent–even readable for those of us who don’t commune daily with the Internal Revenue Code. Tax treatment is only one factor in choosing organizational form, but it’s a significant one. Any law school considering creation of a postgraduate law firm should read Colombo’s concise perspective on organizational form and tax exemption.
The ABA Section of Legal Education’s Council voted unanimously today to defer any action on the Data Committee’s proposal to push back the date on which the ABA measures JD employment outcomes. We expressed our disapproval of this proposal over the last two days. Now others will have a chance to express their views to the Council before its August meeting. Measuring employment outcomes is important for schools, students, prospective students, graduates, and scholars who study the legal market. Any change from the current date requires careful evaluation–and, given the value of comparing outcomes over time, should have to overcome a strong presumption against change.
Kyle wrote yesterday about a proposal to push back the date on which law schools calculate their employment outcomes. Schools currently measure those outcomes on February 15 of each year, nine months after graduation. The proposal would nudge that date to March 15, ten months after graduation. The proposal comes from the Data Policy and Collection Committee of the ABA’s Section of Legal Education and Admissions to the Bar. The Section’s Council will consider the recommendation tomorrow.
Kyle explained how the committee’s report overlooks the needs of prospective law students, focusing instead on accommodating the interests of law schools. I agree with that critique and return to it below. First, however, I want to focus on some mistakes in the committee’s interpretation of the data provided to them by committee member Jerry Organ. Professor Organ was kind enough to share his spreadsheets with me, so I did not have to duplicate his work. He did an excellent job generating raw data for the committee but, as I explain here, the numbers cited by the committee do not support its recommendation. Indeed, they provide some evidence to the contrary.
The Committee’s Rationale and Data
The committee bases its recommendation on the facts that New York and California report bar results later than many other states, and that this hampers students seeking legal jobs in those markets. New York and California law schools, in turn, may have unduly depressed employment outcomes because their newly licensed graduates have less time to find jobs before February 15.
To substantiate this theory, the committee notes that “for graduates in the years 2011 and 2012, 18 of the bottom 37 schools in reported employment rates for the ‘Bar Passage Required, JD Advantage and Other Professional’ categories were located in New York and California.” This statistic is true for 2011, but not quite true for 2012: In 2012, the number is 15 out of 37 schools. But that’s a minor quibble. The bigger problem is that separating the results for California and New York creates a different picture.
California law schools are, in fact, disproportionately represented among the schools that perform worst on the employment metric cited by the committee. The committee examined 2011 employment statistics for 196 law schools and 2012 statistics for 198 schools. California accounted for twenty of the schools in 2011 (10.2%) and twenty-one of them in 2012 (10.6%). In contrast, the bottom 37 schools included 14 California schools in 2011 (37.8%) and 13 California schools in 2012 (35.1%). That’s a pretty large difference.
The New York law schools, on the other hand, are not disproportionately represented among the schools that performed worst on the committee’s reported metric. Fifteen of the examined schools (7.7% in 2011, 7.6% in 2012) are in New York state. The 37 schools that scored lowest on the employment metric, however, include only four New York schools in 2011 (10.8%) and two in 2012 (5.4%). One year is a little higher than we might predict based on the total number of NY schools; the other is a little lower.
Using the committee’s rudimentary analysis, in other words, the data show that one late-reporting state (California) is disproportionately represented among the bottom 37 schools, but another late-reporting state (New York) is not. That evidence actually cuts against the committee’s conclusion. If the timing of bar results accounts for the poor showing among California schools, then we should see a similar effect for New York schools. To compound this NY error, the committee mistakenly names Cardozo and Brooklyn as law schools that fall among the 37 lowest performing schools on the employment metric. Neither of those schools falls in that 37-school category in either year.
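The committee’s own comparison reduces to a handful of proportions. Here is a sketch that reproduces the arithmetic above; a formal two-proportion test would be the natural next step, but even the raw shares make the asymmetry plain.

```python
# State's share of all examined schools vs. its share of the bottom 37.
data = {   # (schools examined, schools among the bottom 37)
    ("CA", 2011): (20, 14), ("CA", 2012): (21, 13),
    ("NY", 2011): (15, 4),  ("NY", 2012): (15, 2),
}
totals = {2011: 196, 2012: 198}   # schools the committee examined each year

for (state, year), (examined, bottom) in data.items():
    overall   = examined / totals[year]   # share of all schools
    in_bottom = bottom / 37               # share of the bottom 37
    flag = "over" if in_bottom > overall else "under"
    print(f"{state} {year}: {overall:.1%} of schools, "
          f"{in_bottom:.1%} of bottom 37 -> {flag}-represented")
```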
It’s possible that a different measure would show a disproportionate impact in New York. I haven’t had time to conduct other analyses; I simply repeated the one that the committee cites. Even if other analyses could show a discrepancy in New York, the committee’s reported data don’t line up with its conclusion. That’s a sloppy basis to support any action by the Section’s Council.
Better Analyses
If the committee (or Council) wants to explore the relationship between bar-result timing and employment outcomes, there are better ways to analyze the data provided by Professor Organ. This issue calls out for regression analysis: that technique could examine more closely the relationship between bar-result timing and employment outcomes, while controlling for factors like each school’s median LSAT, a measure of each school’s reputation, and the state’s bar passage rate. Regression is a routine tool used by many legal educators; it would be easy for the committee to supplement the dataset and conduct the analysis. That would be the best way to discern any relationship between the timing of bar results and employment outcomes.
But I have good news for the committee: There’s no need to improve the data analysis, because we already know enough to reject the proposed timing change.
What Really Matters?
Although the committee’s analysis is weak, I personally have no doubt that the timing of bar admission has some quantifiable relationship with employment outcomes. As the months roll on, more graduates find full-time, long-term professional employment (the outcome examined by the committee). In addition to the simple passage of time, we can all postulate that bar passage helps applicants secure jobs that require bar admission! The question isn’t whether there is some relationship between the timing of bar admission and employment outcomes; there almost certainly is. The real questions for a policy-making committee are:
(a) How big is that effect compared to other effects?
(b) How much would a shift from February 15 to March 15 alter that effect?
(c) What negative impacts would that shift have?
(d) Do the costs outweigh the benefits?
Let’s take a look at each question.
How Big Is the Timing Effect?
We could answer this first question pretty precisely by doing the regression analysis outlined above. Without doing the additional data collection or math, I predict the following outcomes: First, median LSAT or law school reputation will show the greatest correlation with employment outcomes. In other words, each of those variables will correlate significantly with employment outcomes after controlling for other variables, and each of them will account for more variance in employment outcomes than any other variable in the equation. Second, bar passage rates will also have a significant impact on employment outcomes (again while controlling for other factors). Third, other factors (like measures of the strength of the entry-level legal market in each state) will also play a role in predicting employment outcomes. After controlling for factors like these, I predict that the timing of bar admission would show a statistically significant relationship with employment outcomes–but that it would be far from the weightiest factor.
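To be concrete, here is the kind of model I have in mind, sketched in Python with statsmodels. The dataset and its column names are hypothetical; I have not assembled the data or run this analysis.

```python
# Sketch of the school-level regression described above. The CSV and its
# column names are hypothetical; one row per school.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_outcomes.csv")   # hypothetical dataset

# Outcome: share of the class in full-time, long-term professional jobs.
# days_to_results: days between graduation and the state's bar-result release.
model = smf.ols(
    "employment_rate ~ median_lsat + reputation + bar_pass_rate + days_to_results",
    data=df,
).fit()
print(model.summary())  # the days_to_results coefficient tests the timing theory
```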
I mentioned an important factor in that prediction, one that the committee report does not mention: bar passage rates. States have very different bar passage rates, ranging from 68.23% in Louisiana to 93.08% in South Dakota. (Both of those links will take you to Robert Anderson’s excellent analysis of bar exam difficulty. For purposes of this discussion, look at the far right-hand column, which gives actual pass rates.) When talking about employment outcomes, I suspect that differences in bar passage rates are far more important than differences in the timing of bar results. Waiting for bar results can slow down job offers, but flunking the bar hurts a lot more. People who fail the bar, in fact, may lose jobs they had already lined up.
California has the second lowest pass rate in the nation, second only to Louisiana (a state that is distinctive in many ways). Even graduates of ABA-accredited schools in California have a relatively low pass rate (76.9% for first-timers in July 2012) compared to exam-takers in other states. I suspect that much of the “California effect” detected by the ABA committee stems from the state’s high bar failure rate rather than its late reporting of bar results. Bar passage rates alone won’t fully explain differences in employment outcomes; I would perform a full regression analysis if I wanted to explore the factors related to those outcomes. But consider the relative impact of late results and poor results: Graduates who find out in November that they passed the bar may be a few weeks behind graduates in other states when seeking jobs. But graduates who find out in November that they failed the July bar have a whole different problem. Those graduates won’t be working on February 15, because they’ll be studying for the February bar.
California schools and graduates may face a bar timing problem, but they face a much larger bar passage problem. If we’re concerned with leveling the playing field for law schools, that’s a pretty rough terrain to tackle. As I suggest further below, moreover, the Data Committee shouldn’t worry about leveling the field for inter-school competition; after all, the ABA and its Section of Legal Education explicitly repudiate rankings. The committee should focus on the important task of gathering thoughtful data that informs accreditation and protects the public (including potential law students).
How Much Would the Date Shift Help?
Even if California (and maybe NY) schools have a problem related to the timing of bar results, how much would the proposed remedy help? Not very much. As Kyle pointed out yesterday, the date shift will give every school’s graduates an extra month to obtain full-time, long-term employment. Employment rates will go up for all schools, but will any difference between NY/California schools and other schools diminish? The committee actually could address that question with existing data, because there are several states that release bar results considerably earlier than other states. Do schools in those “early release” states have an employment advantage over other schools during October and November? If so, when does the advantage dissipate? A more refined regression analysis could suggest how much difference the proposed change would actually make.
I am relatively confident, meanwhile, that shifting the employment measurement date to March 15 would not address the bar-passage discrepancy I discuss above. The February bar exam occurs during the last week of the month. If low employment rates for California schools stem partly from a disproportionate number of graduates taking the February exam, a March 15 employment date doesn’t help much. Two weeks, give or take a day or two, isn’t much time to recover from the exam, apply for jobs, persuade an employer that you probably passed the exam you just took, and start work.
Negatives
What about downsides to the committee’s proposal? Kyle ably articulated four substantial ones yesterday. First, prospective students will receive employment information a month later, and this is a month that matters. Many law schools require seat deposits by May 1, and admitted students are actively weighing offers throughout April. Providing employment data in late April, rather than by March 31 (the current standard), leaves students waiting too long for important information. We should be striving to give prospective students information earlier in the spring, not later.
In fact, the committee’s report contains a helpful suggestion on this score: It indicates that law schools could submit March 15 employment data by April 7. If that’s true, then schools should be able to submit February 15 data by March 7–allowing the ABA to publish employment information a full week earlier than it currently does. Again, that’s a key week for students considering law school acceptances.
Second, the nine-month measurement day is already three months later than the day that would make most sense to prospective students and graduates. The grace period for repayment of direct unsubsidized loans ends six months after graduation; deferral of repayment for PLUS loans ends at the same time. For prospective students, a very important question is: What are the chances that I’ll have a full-time job when I have to start repaying my loans? We don’t currently answer that question for students. Instead, we tell them how many graduates of each law school have full-time jobs (and other types of jobs) three months after they’ve had to start repaying loans. If we’re going to change the reporting date for employment outcomes, we should move to six months–not ten. Schools could complement the six-month information with nine-month, ten-month, one-year, two-year, or any other measures. Employment rates at six months, however, would be most meaningful to prospective law students.
Third, changing the measurement day impedes comparisons over time. Partly for that reason, I haven’t advocated for a change to the six-month measure–although if change is on the table, I will definitely advocate for the six-month frame. The employment rates collected by the ABA allow comparison over time, as well as among schools. If schools begin reporting 10-month employment rates for the class of 2013, that class’s employment rate almost certainly will be higher than the class of 2012’s nine-month rate. But will the increase be due to improvements in the job market or to the shift in measurement date? If we want to comprehend changes in the job market, and that understanding is as important for schools as it is for students and graduates, there’s a strong reason to keep the measurement constant.
Finally, changing to a ten-month measurement date will make law schools–and their accrediting body–look bad. The committee’s report shows a great concern for “the particular hardship on law schools located in late bar results states,” the “current penalty on law schools who suffer from late bar results,” and the need for “a more level playing field” among those schools. There’s scant mention of the graduates who actually take these exams, wait for the results, search for jobs, remain unemployed nine months after graduation, and begin repaying loans before that date.
Prospective students, current students, graduates, and other law school critics will notice that focus–they really will. Why do law schools suddenly need to report employment outcomes after 10 months rather than nine? Is it because the information will be more timely for prospective students? Or because the information will be more accurate? No, it’s because some schools are suffering a hardship.
The Data Committee and Council need to pay more attention to the needs of students. From the student perspective, the “hardship” or “penalty” that some schools suffer is actually one that their graduates endure. If it takes longer to get a full-time lawyering job in NY or California than in North Carolina or New Mexico, that’s a distinction that matters to the graduates, not just the schools. It’s the graduates who will be carrying very large loans, with ever-accumulating interest, for that extra month or two.
Similarly, if the real “penalty” stems from bar passage rates, that’s a difference that matters a lot to prospective students. It’s harder to pass the bar exam in California than in forty-eight other states and the District of Columbia. If you can’t pass the bar on your first try, your chances of working as a lawyer nine months after graduation fall significantly. Those are facts that affect graduates in the first instance, not law schools. They’re facts that prospective students need to know, not ones that we should in any way smooth over by creating a “level playing field” in which all graduates eventually obtain jobs.
Striking the Balance
The committee’s case for the proposed change is weak: the cited data don’t support the recommendation, the method of analyzing the data is simplistic, and the report doesn’t discuss costs of the proposal. Worse, law students and graduates will read the report’s reasoning as focused on the reputation of law schools, rather than as concerned about providing helpful, timely information to the people who we hope will work in our legal system. The committee could improve its analyses and reasoning, but the better move would be to reject the proposal and focus on more important matters.
On Friday, the ABA Section of Legal Education considers a recommendation from the section’s data policy committee about when schools collect graduate employment data. Instead of collecting data nine months after graduation, schools would collect data ten months after graduation.
The change looks minor, but it’s misguided. The council should dismiss the recommendation outright for the reasons outlined below; at a minimum, it should decline to act on the recommendation this week.
The committee’s reasoning is straightforward: some graduates don’t obtain jobs by the nine-month mark because some state bars have a slow licensing process. As committee chair Len Strickman puts it in the committee’s recommendation memo, the data policy change would have “the benefit of a more level playing field.”
Several New York and California deans have lobbied for the policy change because those jurisdictions release July bar results so late. Last year, California provided results on November 16th, with swearing-in ceremonies in the following weeks. New York provided results earlier, on November 1st, but many graduates then waited months to be sworn in.
A variety of employers, such as small firms and state governments, tend to hire licensed graduates. Compared to schools in states with a quicker credentialing process, New York and California schools are disadvantaged on current employment metrics. Changing the measurement date to mid-March instead of mid-February would allegedly take some of the bite out of that disadvantage.
To check for a quantifiable advantage, the data policy committee considered two sets of data. First, the committee sorted schools by the percentage of 2012 graduates working professional jobs (lawyers or otherwise) as of February 15, 2013. Second, the committee sorted schools by the percentage of 2012 graduates who were unemployed or had an unknown employment status. For both measures, the committee determined that New York and California schools were disproportionately represented on the bad end of the curve.
Professor Strickman notes in his committee memo that many of the poorly performing schools “are broadly considered to be highly competitive schools nationally.” I’m not sure exactly what this means, but it sounds a lot like confirmation bias. Is he suggesting that the employment outcomes don’t match U.S. News rankings? The committee’s collective impression of how well the schools should perform relative to others? Faculty reputation? It’s a mystery, and without further support, not at all compelling.
Professor Strickman acknowledges that other factors may explain the relative placement. He does not name or address them. Here are some factors that may explain the so-called disadvantage:
(1) Graduate surplus (not just 2012, but for years);
(2) Attractiveness of certain states to graduates from out-of-state schools;
(3) Overall health of local legal markets;
(4) Graduate desirability;
(5) Ability of schools to fund post-graduation jobs.
Nor do we even know whether the rule revision would level the playing field. In other words, one extra month may not capture more professional job outcomes for graduates of New York and California schools than for graduates of other schools. More time, after all, ought to produce better results for all schools with high under- and unemployment.
In sum, the committee should have declined to recommend the ten-month proposal until its proponents meet their burden of persuasion. The problem has not been well articulated, and the data do not support the conclusion.
Worse than recommending an unsupported policy change, the committee ignores the group for whom law schools produce job statistics: prospective students. Prospective students, current students, and a society that depends on lawyers are the Section of Legal Education’s constituents. Calling the uneven playing field a “disadvantage,” “penalty,” and “hardship” for law schools shows whose perspective the committee adopted.
(1) Is there a normative problem with an uneven playing field?
It’s not apparent that there’s an issue to resolve. Grant the committee its premise that state credentialing timelines affect performance on employment metrics. Is it the ABA’s job to ensure that schools compete with each other on a level playing field?
In one sense, yes, of course. When a school lies, cheats, or deceives, it gains an undeserved advantage, and ABA Standard 509 prohibits this behavior. But the standard does not prohibit that behavior because of how it affects school-on-school competition; the prohibition is a consequence of the ABA’s role in protecting consumers and the public.
The ABA was ahead of the curve when it adopted Standard 509 in the 1990s. The organization interpreted its accreditation role to include communicating non-educational value, such as employment information, to these constituents.
Here, the ABA failed to adequately consider the prospective students who want to make informed decisions, and the public, which subsidizes legal education.
Prospective students received only a passing mention in Professor Strickman’s memo. In describing why the committee rejected several schools’ request to move the measurement date back to one year after graduation, Professor Strickman explains:
The Data Policy and Collection Committee decided to reject this request because that length of delay would undermine the currency of data available to prospective law students.
As it happens, the committee’s chosen proposal also has a currency problem. And the committee failed to convey whether, or how, it considered the change’s impact on the value of the consumer information.
(2) Does the new policy impede a prospective student’s ability to make informed decisions?
One of the ABA’s recent accomplishments was accelerating the publication of employment data. Previously, the ABA published new employment data 16 months after schools measured employment outcomes. In 2013, the ABA took only six weeks.
But if the Section of Legal Education adopts the ten-month proposal, it pushes data publication to the end of April—after many deposit deadlines and on the eve of others. While applicants should not overrate the importance of year-to-year differences, they should have the opportunity to evaluate the changes.
The new policy also makes the information less useful.
At one time, schools reported graduate employment outcomes as of six months after graduation. In 1996, NALP began measuring outcomes at nine months instead. The ABA, which at that time only asked schools to report their NALP employment rate, followed.
The six-month measurement makes far more sense than the nine-month date. Six months after graduating, interest accumulated during school capitalizes and the first loan payment is due. Ideally that six-month period would be used to pay down the accumulated interest so that less interest is paid later. The credentialing process makes this a rarity. Adding another month to the measurement makes the figure even less valuable.
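To see what capitalization costs, here is a minimal sketch with hypothetical loan figures (the balance, accrued interest, rate, and repayment term below are illustrative assumptions, not data from NALP or the ABA). It compares a graduate who pays off the accrued interest during the six-month grace period against one who, still stuck in the credentialing pipeline, lets it capitalize:

```python
# Hypothetical figures for illustration only -- not numbers from any
# NALP or ABA report.
PRINCIPAL = 120_000        # amount borrowed during law school
ACCRUED_INTEREST = 12_000  # interest accrued before the grace period ends
ANNUAL_RATE = 0.068        # assumed fixed annual interest rate
YEARS = 10                 # standard repayment term

def monthly_payment(balance, annual_rate=ANNUAL_RATE, years=YEARS):
    """Standard amortization: payment = B*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12
    n = years * 12
    return balance * r / (1 - (1 + r) ** -n)

# Scenario A: the graduate uses the six-month grace period to pay off
# the accrued interest, so only the original principal amortizes.
pay_a = monthly_payment(PRINCIPAL)
total_a = ACCRUED_INTEREST + pay_a * YEARS * 12

# Scenario B: no job yet, so the accrued interest capitalizes and
# interest is charged on the larger balance for the full term.
pay_b = monthly_payment(PRINCIPAL + ACCRUED_INTEREST)
total_b = pay_b * YEARS * 12

print(f"Paid before capitalization: ${total_a:,.0f}")
print(f"Interest capitalized:       ${total_b:,.0f}")
print(f"Cost of capitalization:     ${total_b - total_a:,.0f}")
```

Under these assumptions, letting the interest capitalize adds several thousand dollars to the lifetime cost of the loan, which is why the six-month date tracks a financially meaningful moment in a graduate’s life.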
Reducing comparability also dilutes the value of recent employment information. Students should not consider one year of data in isolation, but should analyze changes and the reasons for those changes. It’s for this reason that the ABA requires schools to publish three years of employment data as of last August.
The council needs additional viewpoints on the data policy committee. Right now, law schools dominate: all twelve members are current faculty, deans, or other administrators. The name change from the “Questionnaire Committee” to the “Data Policy and Collection Committee” signals a policy role for the group.
Just as the council, standards committee, and accreditation committee need a diversity of viewpoints, so too does the data policy committee. Had that diversity existed from the start, perhaps the new measurement date would not have been recommended prematurely, or at all.
As the council considers whose interests it serves and whether the data policy recommendation is ripe for adoption, I hope its members also consider the drivers of the policy beyond a law school lobby promoting its own interests.
The policy presupposes a reality in which so many graduates cannot derive economic value from their law degrees nine months after graduating that the ABA needs to modify its collection policy in order to count them.
Let me repeat that. It takes so long to become a lawyer that, almost a year after graduation, it is reasonable to think many people are still not using a credential they invested more than three years of time, money, and effort to receive. A career is (hopefully) decades long, but the brutal reality of credentialing is that its costs exceed what any fair system would contemplate. That a change to the data policy passes for a solution would be funny were the economics of legal education not so tragic.
This piece was originally published by the ABA Journal.
Change is coming to a law school near you. Economics will drive the change, but the exact configuration will depend on choices made by law schools, state supreme courts, the ABA, and Congress over the next few years.
Without intervention, market forces are likely to segment law schools. Are schools and the profession content with that outcome? The question warrants serious debate.
Law schools have entered crisis mode as word spreads about their costs and job outcomes. In recent years, tens of thousands of graduates have struggled to enter the legal marketplace and find professional jobs with salaries that permit them to service student loan debt. As a result of a steep drop in applications and enrollment, schools face a budgetary crisis—one certain to change the face of legal education. We can bend the future, but only if reform happens through the lens of fixing law school economics.
The drivers of this change are on course to stratify legal education for lawyers into two layers.
One group of law schools—perhaps a few dozen “elite” schools—will continue using the traditional model. Research faculties will teach high-achieving students from around the country and world. Graduates from these schools will continue to obtain the most competitive jobs after achieving traditional market signals like high GPA and law review membership.
These schools will be cheaper by today’s standards, yet expensive by any reasonable measure. Classes will follow a curriculum designed using core lawyering competencies and will involve more simulations and more writing.
Overall, elite schools won’t look much different from today’s law school—a professional and graduate school hybrid that tries to serve both the legal profession and the pursuit of knowledge. Nevertheless, they will feel different because the educational product will be more skills-oriented.
The second group of law schools—perhaps a few hundred “local” schools, including new ones—will use a model centered on teaching faculty. These schools will take educational approaches similar to the elite schools’, but they will look much different. The faculty will be hired for their experience as lawyers, judges, regulators, and policy wonks. Scholarship may not be part of the job description, but it will endure because the desire to analyze the world around you is human nature. The schools may teach undergraduates, paralegals, and other professionals in addition to lawyers. Ultimately, local leaders and lawyers will shape an education that is less graduate studies and more professional development.
Affordability will be a feature, but local schools will be defined by the ownership the local legal community takes in educating future members. The result will be a faculty that fluidly moves between practicing and teaching.
A transient faculty will provide opportunities, but also challenges, particularly ensuring a high-quality, consistent product capable of teaching each student what they need to succeed. To overcome some of these challenges, schools will share faculty—sometimes across town, sometimes across time zones—and course materials, because sharing is more efficient than trying to hire for every need and having part-time teachers reinvent the wheel each term.
Although it’s the broken economics of law school accelerating reform discussions, demands for change concern just about every aspect of law school and come from diverse perspectives. Many stakeholders view the crisis as an opportunity to shape the future. Not everything needs to or will change, but widespread dissatisfaction has put everything on the table.
There are three main drivers of change, each tied to the future I’m predicting:
First, the cost of becoming a lawyer is too high. Tuition skyrocketed because law schools operated in a completely dysfunctional market. Law students (and therefore law schools) had unfettered access to student loans with little downward pressure on borrowing. Attitudes about student debt were unsophisticated, and schools enjoyed an information asymmetry about post-graduation employment outcomes. While the loan system still provides blank checks, applicants now have credible employment information and are becoming increasingly price-sensitive.
As the applicant market becomes more functional, at-risk schools will cut their budgets to meet demand. Surviving schools will be those that accept the need to reinvent themselves rather than rely on minor changes. Budgets are largely personnel-driven, so most schools will need to figure out how to deliver education more leanly. This will all but necessitate involvement from the local bench and bar.
This brings us to the second driver: the bench and bar. Practicing lawyers and judges are fed up with the quality of legal education. The steady drumbeat for more practical skills training isn’t new—in fact, it’s a century old. But the opportunity for reshaping law schools is new because of the information about, and coverage of, their broken economics. The trouble: Creating a law school experience that the profession wants requires a redefinition of the law school mission. It must become more professional school than graduate school.
The opportunity stems partly from the third driver: the legal profession’s structural transformation. The media began paying attention to law graduate struggles when it became apparent that even graduates of the country’s most elite schools struggled in “the new normal.” This accelerated the decline in the JD’s perceived value and invited a multitude of skeptical voices to shout their discontent.
Yet the structural change has been more gradual. Over many decades, practice has grown more complex and specialized. Technology, globalization and the unbundling of legal services have accelerated the change. The legal profession of the future looks different; so too will the education system that produces its members.
Upholding the broad and often elusive principles of the American legal system—such as equality, opportunity, and justice—requires a legal education system that’s not merely subservient to market forces. Successfully addressing the drivers of change without flattening essential principles depends on whether the solutions explored and adopted provide more than lip service to the broken economics of the modern law school.
If we lose sight of what’s causing the change, we may lose the opportunity to bend the course for the better.