
Old Tricks

June 23rd, 2013

From time to time, I like to read real books instead of electronic ones. During a recent ramble through my law school’s library, I stumbled across an intriguing set of volumes: NALP employment reports from the late nineteen seventies. These books are so old that they still have those funny cards in the back. It was the content, though, that really took my breath away. During the 1970s, NALP manipulated data about law school career outcomes in a way that makes more contemporary methods look tame. Before I get to that, let me give you the background.

NALP compiled its first employment report for the Class of 1974. The data collection was fairly rudimentary. The association asked all ABA-accredited schools to submit basic data about their graduates, including the total number of class members, the number employed, and the number known to be still seeking work. This generated some pretty patchy statistics. Only 83 schools (out of about 156) participated in the original survey. Those schools graduated 17,188 JDs, but they reported employment data for just 13,250. More than a fifth of the graduates (22.9%) from this self-selected group of schools failed to share their employment status with the schools.

NALP’s early publications made no attempt to analyze this selection bias; the reports I’ve examined (for the Classes of 1977 and 1978) don’t even mention the possibility that graduates who neglect to report their employment status might differ from those who provide that information. The reports address the representativeness of participating schools, but in a comical manner. The reports divide the schools by institutional type (e.g., public or private) and geographic region, then present a cross-tabulation showing the number and percentage of schools participating in each category. For the Class of 1977, participation rates varied from 62.5% to 100%, but the report gleefully declares: “You will note the consistently high percentage of each type of institution, as well as the large number of schools sampled. I believe we can safely say that our study is, in fact, representative!” (p. 7)

Anyone with an elementary grasp of statistics knows that’s nonsense. The question isn’t whether the percentages were “high,” it’s how they varied across categories. Ironically, at the very time that NALP published the quoted language, I was taking a first-year elective on “Law and Social Science” at my law school. It’s galling that law schools weren’t practicing the quantitative basics that they were already teaching.

NALP quickly secured more participating schools, which mooted this particular example of bad statistics. By 1978, NALP was obtaining responses from 150 of the 167 ABA-approved law schools. Higher levels of school participation, however, did not solve the problem of missing graduates. For the Classes of 1974 through 1978, NALP was missing data on 19.4% to 23.7% of the graduates from reporting schools. Blithely ignoring those graduates, NALP calculated the employment rate each year simply by dividing the number of graduates who held any type of job by the number whose employment status was known. This misleading method, which NALP still uses today, yielded an impressive employment rate of 88.1% for the Class of 1974.
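The arithmetic is easy to check. Here is a minimal sketch using the Class of 1974 figures above; the employed count is back-computed from the reported 88.1% rate, so it is an estimate rather than a number taken from the report itself:

```python
# NALP's Class of 1974 method: divide employed graduates by those whose
# status was known, ignoring the 3,938 non-respondents entirely.
total_grads = 17_188   # JDs from the 83 reporting schools
known_status = 13_250  # graduates whose employment status was reported
employed = round(0.881 * known_status)  # back-computed from the 88.1% rate

rate_nalp = employed / known_status  # NALP's denominator: known status only
rate_all = employed / total_grads    # alternative denominator: every graduate

print(f"NALP rate:      {rate_nalp:.1%}")  # 88.1%
print(f"All-grads rate: {rate_all:.1%}")   # 67.9%
```

Simply changing the denominator from "status known" to "all graduates" drops the headline number by twenty points, which is why the missing fifth of the class matters so much.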

But even that wasn’t enough. Starting with the Class of 1975, NALP devised a truly ingenious way to raise employment rates: It excluded from its calculation any graduate who had secured neither a job nor bar admission by the spring following graduation. As NALP explained in the introduction to its 1977 report: “The employment market for new attorneys does not consist of all those that have graduated from ABA-approved law schools. In order for a person to practice law, there is a basic requirement of taking and passing a state bar examination. Those who do not take or do not pass the bar examination should therefore be excluded from the employment market….” (p. 1)

That would make sense if NALP had been measuring the percentage of bar-qualified graduates who obtained jobs. But here’s the kicker: At the same time that NALP excluded unemployed bar no-admits from its calculation, it continued to include employed ones. Many graduates in the latter category held jobs that we call “JD Advantage” ones today. NALP’s 1975 decision gave law schools credit for all graduates who found jobs that didn’t require a law license, while allowing them to disown (for reporting purposes) graduates who didn’t obtain a license and remained jobless.

I can’t think of a justification for that–other than raising the overall employment rate. Measure employment among all graduates, or measure it among all grads who have been admitted to the bar. You can’t use one criterion for employed graduates and a different one for unemployed graduates. Yet the “NALP Research Committee, upon consultation with executive committee members and many placement directors from throughout the country” endorsed this double standard. (id.)

And the trick worked. By counting graduates who didn’t pass the bar but nonetheless secured employment, while excluding those who didn’t take the bar and failed to get jobs, NALP produced a steady rise in JD employment rates: 88.1% in 1974 (under the original method), 91.6% in 1975, 92.5% in 1976, 93.6% in 1977, and a remarkable 94.2% in 1978. That 94.2% statistic ignored 19.5% of graduates who didn’t report any employment status, plus another 3.7% who hadn’t been admitted to the bar and were known to be unemployed but, whatever.
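To see how the trick moves the needle, here is a sketch with a hypothetical class of 100 graduates of known status; the category counts are invented for illustration, not drawn from NALP's reports:

```python
# Hypothetical class of 100 graduates with known status, showing how the
# 1975-78 rule inflates the rate. All counts below are illustrative.
employed_admitted = 80    # employed, bar-admitted (counted)
employed_no_bar = 8       # employed without bar admission (still counted)
unemployed_admitted = 5   # unemployed, bar-admitted (counted)
unemployed_no_bar = 7     # unemployed without bar admission (EXCLUDED)

total_known = 100
honest = (employed_admitted + employed_no_bar) / total_known
nalp_1975 = (employed_admitted + employed_no_bar) / (total_known - unemployed_no_bar)

print(f"Consistent rate: {honest:.1%}")    # 88.0%
print(f"1975-rule rate:  {nalp_1975:.1%}") # 94.6%
```

The numerator never changes; only the inconvenient graduates vanish from the denominator.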

NALP was very pleased with its innovation. The report for the Class of 1977 states: “This revised and more realistic picture of the employment market for newly graduated and qualified lawyers reveals that instead of facing unemployment, the prospects for employment within the first year of graduation are in fact better than before. Study of the profile also reveals that there has been an incremental increase in the number of graduates employed and a corresponding drop in unemployment during that same period.” (p. 21) Yup, unemployment rates will fall if you ignore those pesky graduates who neither found jobs nor got admitted to the bar–while continuing to count all of the JD Advantage jobs.

I don’t know when NALP abandoned this piece of data chicanery. My library didn’t order any of the NALP reports between 1979 and 1995, so I can’t trace the evolution of NALP’s reporting method. By 1996, NALP was no longer counting unlicensed grads with jobs while ignoring those without jobs. Someone helped them come to their senses.

Why bring this up now? In part, I’m startled by the sheer audacity of this data manipulation. Equally important, I think it’s essential for law schools to recognize our long history of distorting data about employment outcomes. During the early years of these reports, NALP didn’t even have a technical staff: these reports were written and vetted by placement directors from law schools. It’s a sorry history.


Too Soon to Tell

June 21st, 2013

At Washington & Lee, as at most schools right now, we would prefer that our students were more successful in obtaining employment. But the 2012 employment figures, unfortunate as they are, say nothing about our curricular reform. It is simply too early . . . much too early.

The 2012 numbers refer to the first full class to pass through the reformed third-year curriculum. Ours is a slow-to-change profession. Employers as a group do not change their settled practices on a dime. Nothing in the employment numbers that we see for the next three to five years should be seen as reflecting on the reception given to the curriculum reform. No curricular reform I know of, including Langdell’s, changed the settled practices of others overnight.


NALP Numbers

June 20th, 2013

NALP, the National Association for Law Placement, has released selected findings about employment for the Class of 2012. The findings and accompanying press release don’t tell us much more than the ABA data published in late March, but there are a few interesting nuggets. Here are my top ten take-aways from the NALP data.

1. Law school leads to unemployment. I’m sorry to put that so bluntly, but it’s true. Even after adopting a very generous definition of employment–one that includes any work for pay, whatever the nature of the work, the number of hours worked per week, and the permanence (or lack thereof) of the position–only 84.7% of graduates from ABA-accredited schools were employed nine months after graduation. Almost one in six graduates had no job at all nine months after graduation. That statistic is beyond embarrassing.

Some of those graduates were enrolled in other degree programs, and some reported that they were not seeking work. Neither of those categories, however, should offer much comfort to law schools or prospective students. It’s true that yet another degree (say an LLM in tax or an MBA) may lead to employment, but those degrees add still more time and money to a student’s JD investment. Graduates who are unemployed and not seeking work, meanwhile, often are studying for the February bar exam–sometimes after failing on their first attempt. Again, this is not a comforting prospect for students considering law school.

Even if we exclude both of those categories, moreover, 10.7% of 2012 graduates–more than one in every ten–were completely unemployed and actively seeking work in February 2013. The national unemployment rate that month was just 7.7%. Even among 25-to-29-year-olds, a group that faces higher than average unemployment, the most recent reported unemployment rate (for 2012) was 8.9%. Recent graduates of ABA-accredited law schools are more likely to be unemployed than other workers their age–most of whom have far less education.

2. Nine months is a long time. When responding to these dismal nine-month statistics, law schools encourage graduates to consider the long term. Humans, however, have this annoying need to eat, stay warm, and obtain health care in the present. Most of us would be pretty unhappy if we were laid off and it took more than nine months to find another job. How would we buy food, pay our rent, and purchase prescriptions during those months? For new graduates it’s even worse. They don’t have the savings that more senior workers may have as a cushion for unemployment; nor can they draw unemployment compensation. On the contrary, they need to start repaying their hefty law school loans six months after graduation.

When we read nine-month statistics, we should bear those facts in mind. Sure, the unemployed graduates may eventually find work. But most of them already withdrew from the workforce for three years of law school; borrowed heavily to fund those years; borrowed still more to support three months of bar study; sustained themselves (somehow) for another six months; and have been hearing from their loan repayment companies for three months. If ten percent are still unemployed and seeking work the February after graduation, what are they living on?

3. If you want to practice law, the outlook is even worse. Buried in the NALP releases, you’ll discover that only 58.3% of graduates secured a full-time job that required bar admission and would last at least a year. Even that estimate is a little high, because NALP excludes from its calculation over 1,000 graduates whose employment status was unknown. Three years of law school, three months of bar study, six months of job hunting, and more than two out of every five law graduates still have not found steady, full-time legal work. If you think those two wanted JD Advantage jobs, read on.

4. Many of the jobs are stopgap employment. Almost a quarter of 2012 graduates with jobs in February 2013 were actively looking for other work. The percentage of dissatisfied workers was particularly high among those with JD Advantage positions: forty-three percent of them were seeking another job. JD Advantage positions offer attractive career options for some graduates, but for many they are simply a way to pay the bills while continuing the hunt for a legal job.

5. NALP won’t tell you what you want to know. When the ABA reported similar employment statistics in March, it led with the information that most readers want to know: “Law schools reported that 56.2 percent of graduates of the class of 2012 were employed in long-term, full-time positions where bar passage was required.” The ABA’s press release followed up with the percentage of graduates in long-term, full-time JD Advantage positions (9.5%) and offered comparisons to 2011 for both figures. Bottom line: Nine months after graduation, about two-thirds of 2012 graduates had full-time, steady employment related to their JD.

You won’t find that key information in either of the two reports that NALP released today. You can dig out the first of those statistics (the percentage of the class holding full-time, long-term jobs that required bar admission), but it’s buried at the bottom of the second page of the Selected Findings. You won’t find the second statistic (the percentage of full-time, long-term JD Advantage jobs) anywhere; NALP reports only a more general percentage (including temporary and part-time jobs) for that category.

NALP’s Executive Director, James Leipold, laments disclosing even that much. He tells us that the percentage of full-time, long-term jobs requiring bar passage “is certainly not a fair measure of the value of a legal education or the return on investment, or even a fair measure of the success of a particular graduating class in the marketplace.” Apparently forgetting the ABA’s attention to this employment measure, Leipold dismisses it as “the focus of so much of the media scrutiny of legal education.”

What number does NALP feature instead? That overall employment rate of 84.7%, which includes non-professional jobs, part-time jobs, and temporary jobs. Apparently those jobs are a more “fair measure of the value of a legal education.”

6. Law students are subsidizing government and nonprofits. NALP observes that the percentage of government and public interest jobs “has remained relatively stable for more than 30 years, at 26-29%.” At the same time, it reports that most law-school-funded jobs lie in this sector. If the percentage of jobs has remained stable, and law schools are now funding some of those spots, then law schools are subsidizing the government and public interest work. “Law schools,” of course, means students who pay tuition to those schools. Even if schools support post-graduate fellowships with donor money, those contributions could have been used to defray tuition costs.

I’m all in favor of public service, but shouldn’t the taxpayers and charitable donors pay for that work? In the current scheme, law students are borrowing significant sums from the government, at high interest rates, so that they can pay tuition that is used to subsidize government and nonprofit employees. Call me old fashioned, but that seems like a complicated (and regressive) way to pay for needed services. Why not raise taxes on people like me, who actually earn money, rather than issue more loans to people who hope someday to earn money?

7. Don’t pay much attention to NALP’s salary figures. NALP reports some salary information, which the ABA eschews. Those tantalizing figures draw some readers to the NALP report–and hype the full $95 version it will release in August. But the salary numbers are more misleading than useful. NALP reports salary information only for graduates who hold full-time, long-term positions and who report their salaries. That’s a minority of law graduates: Last year NALP reported salaries for just 18,639 graduates, from a total class of 44,495. Reported salaries, therefore, represented just 41.9% of the class. The percentage this year is comparable.

That group, furthermore, disproportionately represents the highest salaries. As NALP itself recognizes, salaries are “disproportionately reported for those graduates working at large firms,” so median salaries are “biased upward.” Swallow any salary reports, in other words, with a tablespoon of salt.

8. After accounting for inflation, today’s reported salaries are lower than ones from the last century. Although NALP’s reported salaries skew high, they offer some guidance to salary trends over time. Unfortunately, those trends are negative. During the early nineteen nineties, the country was in recession and law firms hadn’t yet accelerated pay for new associates. The median reported salary for 1991 graduates was just $40,000. Accounting for inflation, that’s equivalent to a 2012 median salary of $67,428. The actual reported median for that class, however, was just $61,245. Even when today’s graduates land a full-time, steady job, they’re earning 9.2% less than graduates from the last century.
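The adjustment is a straightforward CPI ratio. A sketch using annual CPI-U averages (136.2 for 1991 and 229.594 for 2012, per the BLS historical tables) reproduces the figures above:

```python
# Converting the 1991 median salary into 2012 dollars via the CPI-U ratio,
# then comparing it to the actual reported 2012 median.
cpi_1991, cpi_2012 = 136.2, 229.594  # BLS annual averages
median_1991 = 40_000
median_2012_actual = 61_245

median_1991_in_2012_dollars = median_1991 * cpi_2012 / cpi_1991
decline = 1 - median_2012_actual / median_1991_in_2012_dollars

print(f"${median_1991_in_2012_dollars:,.0f}")  # $67,428
print(f"{decline:.1%} real decline")           # 9.2%
```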

9. The lights of BigLaw continue to dim. NALP acknowledges the “‘new normal’ in which large firm hiring has recovered some but remains far below pre-recession highs.” The largest firms, those with more than 500 lawyers, hired more than 3,600 members of the Class of 2012, a total that modestly exceeded the number hired from the Class of 2011. Current employment, however, remains well shy of the 5,100 graduates hired from the Class of 2009. Meanwhile, a growing percentage of those BigLaw hires are staff attorneys rather than associates. These lower-status, lower-paid lawyers currently comprise 4% of new BigLaw hires, and they are “more common than just two years ago.”

Inflation, meanwhile, has eroded salaries for even the best paid associates in BigLaw. In 2000, NALP reported a median salary of $125,000 for graduates joining firms that employed more than 500 lawyers. Adjusting for inflation, that would be $166,662 for the Class of 2012. BigLaw associates won’t starve on the median $160,000 they’re actually earning, but they’re taking home less in real dollars than the associates who started at the turn of the century.

For associates joining the second tier of BigLaw, firms that employ 251-500 lawyers, the salary news is even worse. In 2000, those associates also reported a median salary of $125,000, which would translate to $166,662 today. The actual median, however, appears to be $145,000 (the same figure reported for 2011). That’s a decline of 13% in real dollars.

10. It goes almost without saying that these 2012 graduates paid much more for their law school education than students did in 1991, 2000, or almost any other year. Law school tuition has far outpaced inflation over the last three decades. It’s scant comfort to this class–or to the classes of 2010, 2011, or 2013–that heavy discounts are starting to ease tuition. These are classes that bought very high and are selling very low. There’s little that law schools can do to make the difference up to these graduates, but we shouldn’t forget the financial hardship they face. If nothing else, the tuition-jobs gap for these classes should make us commit to the boldest possible reforms of legal education.


Council Declines to Act

June 7th, 2013

The ABA Section of Legal Education’s Council voted unanimously today to defer any action on the Data Committee’s proposal to push back the date on which the ABA measures JD employment outcomes. We expressed our disapproval of this proposal over the last two days. Now others will have a chance to express their views to the Council before its August meeting. Measuring employment outcomes is important for schools, students, prospective students, graduates, and scholars who study the legal market. Any change from the current date requires careful evaluation–and, given the value of comparing outcomes over time, should have to overcome a strong presumption against change.


Data on the Proposed Date Change

June 6th, 2013

Kyle wrote yesterday about a proposal to push back the date on which law schools calculate their employment outcomes. Schools currently measure those outcomes on February 15 of each year, nine months after graduation. The proposal would nudge that date to March 15, ten months after graduation. The proposal comes from the Data Policy and Collection Committee of the ABA’s Section of Legal Education and Admissions to the Bar. The Section’s Council will consider the recommendation tomorrow.

Kyle explained how the committee’s report overlooks the needs of prospective law students, focusing instead on accommodating the interests of law schools. I agree with that critique and return to it below. First, however, I want to focus on some mistakes in the committee’s interpretation of the data provided to them by committee member Jerry Organ. Professor Organ was kind enough to share his spreadsheets with me, so I did not have to duplicate his work. He did an excellent job generating raw data for the committee but, as I explain here, the numbers cited by the committee do not support its recommendation. Indeed, they provide some evidence to the contrary.

The Committee’s Rationale and Data

The committee bases its recommendation on the facts that New York and California report bar results later than many other states, and that this hampers students seeking legal jobs in those markets. New York and California law schools, in turn, may have unduly depressed employment outcomes because their newly licensed graduates have less time to find jobs before February 15.

To substantiate this theory, the committee notes that “for graduates in the years 2011 and 2012, 18 of the bottom 37 schools in reported employment rates for the ‘Bar Passage Required, JD Advantage and Other Professional’ categories were located in New York and California.” This statistic is true for 2011, but not quite true for 2012: In 2012, the number is 15 out of 37 schools. But that’s a minor quibble. The bigger problem is that separating the results for California and New York creates a different picture.

California law schools are, in fact, disproportionately represented among the schools that perform worst on the employment metric cited by the committee. The committee examined 2011 employment statistics for 196 law schools and 2012 statistics for 198 schools. California accounted for twenty of the schools in 2011 (10.2%) and twenty-one of them in 2012 (10.6%). In contrast, the bottom 37 schools included 14 California schools in 2011 (37.8%) and 13 California schools in 2012 (35.1%). That’s a pretty large difference.

The New York law schools, on the other hand, are not disproportionately represented among the schools that performed worst on the committee’s reported metric. Fifteen of the examined schools (7.7% in 2011, 7.6% in 2012) are in New York state. The 37 schools that scored lowest on the employment metric, however, include only four New York schools in 2011 (10.8%) and two in 2012 (5.4%). One year is a little higher than we might predict based on the total number of NY schools; the other is a little lower.
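The comparison in the last two paragraphs reduces to a simple calculation: each state's share of the bottom 37 schools versus its baseline share of all schools examined. A sketch with the 2011 figures from the text:

```python
# Each state's share of the bottom 37 schools, versus its baseline share
# of all schools examined. Over-representation means the first share is
# well above the second.
def shares(in_bottom, state_total, examined_total, bottom=37):
    """Return (share of the bottom group, baseline share of all schools)."""
    return in_bottom / bottom, state_total / examined_total

# 2011 figures from the text above
ca_2011 = shares(14, 20, 196)  # California
ny_2011 = shares(4, 15, 196)   # New York

print(ca_2011)  # roughly (0.378, 0.102): CA heavily over-represented
print(ny_2011)  # roughly (0.108, 0.077): NY close to proportional
```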

Using the committee’s rudimentary analysis, in other words, the data show that one late-reporting state (California) is disproportionately represented among the bottom 37 schools, but another late-reporting state (New York) is not. That evidence actually cuts against the committee’s conclusion. If the timing of bar results accounts for the poor showing among California schools, then we should see a similar effect for New York schools. To compound this NY error, the committee mistakenly names Cardozo and Brooklyn as law schools that fall among the 37 lowest performing schools on the employment metric. Neither of those schools falls in that 37-school category in either year.

It’s possible that a different measure would show a disproportionate impact in New York. I haven’t had time to conduct other analyses; I simply repeated the one that the committee cites. Even if other analyses could show a discrepancy in New York, the committee’s reported data don’t line up with its conclusion. That’s a sloppy basis to support any action by the Section’s Council.

Better Analyses

If the committee (or Council) wants to explore the relationship between bar-result timing and employment outcomes, there are better ways to analyze the data provided by Professor Organ. This issue calls out for regression analysis: that technique could examine more closely the relationship between bar-result timing and employment outcomes, while controlling for factors like each school’s median LSAT, a measure of each school’s reputation, and the state’s bar passage rate. Regression is a routine tool used by many legal educators; it would be easy for the committee to supplement the dataset and conduct the analysis. That would be the best way to discern any relationship between the timing of bar results and employment outcomes.
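For readers who want to see the shape of such an analysis, here is a sketch on synthetic data. The predictors, coefficients, and sample size are invented for illustration; they are not estimates from Professor Organ's dataset, and a real analysis would add reputation and market-strength controls:

```python
# Ordinary least squares on synthetic school-level data: does timing of
# bar results predict employment outcomes after controlling for LSAT and
# bar passage? All numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 198  # roughly the number of schools the committee examined

lsat = rng.normal(155, 5, n)                 # median LSAT
pass_rate = rng.uniform(0.60, 0.95, n)       # state bar pass rate
weeks_to_results = rng.uniform(8, 26, n)     # weeks until bar results

# Synthetic employment rate: strong LSAT and pass-rate effects,
# a small timing effect, plus noise.
employment = (0.02 * lsat + 0.4 * pass_rate
              - 0.002 * weeks_to_results
              + rng.normal(0, 0.03, n))

X = np.column_stack([np.ones(n), lsat, pass_rate, weeks_to_results])
coef, *_ = np.linalg.lstsq(X, employment, rcond=None)
print(coef)  # recovered intercept and slopes, near the true values above
```

The point of the exercise is the structure, not the numbers: with the controls in the model, the coefficient on `weeks_to_results` isolates whatever relationship timing has, and its size can be compared directly to the other effects.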

But I have good news for the committee: There’s no need to improve the data analysis, because we already know enough to reject the proposed timing change.

What Really Matters?

Although the committee’s analysis is weak, I personally have no doubt that the timing of bar admission has some quantifiable relationship with employment outcomes. As the months roll on, more graduates find full-time, long-term professional employment (the outcome examined by the committee). In addition to the simple passage of time, we can all postulate that bar passage helps applicants secure jobs that require bar admission! The question isn’t whether there is some relationship between the timing of bar admission and employment outcomes; there almost certainly is. The real questions for a policy-making committee are:

(a) How big is that effect compared to other effects?
(b) How much would a shift from February 15 to March 15 alter that effect?
(c) What negative impacts would that shift have?
(d) Do the costs outweigh the benefits?

Let’s take a look at each question.

How Big Is the Timing Effect?

We could answer this first question pretty precisely by doing the regression analysis outlined above. Without doing the additional data collection or math, I predict the following outcomes: First, median LSAT or law school reputation will show the greatest correlation with employment outcomes. In other words, each of those variables will correlate significantly with employment outcomes after controlling for other variables, and each of them will account for more variance in employment outcomes than any other variable in the equation. Second, bar passage rates will also have a significant impact on employment outcomes (again while controlling for other factors). Third, other factors (like measures of the strength of the entry-level legal market in each state) will also play a role in predicting employment outcomes. After controlling for factors like these, I predict that the timing of bar admission would show a statistically significant relationship with employment outcomes–but that it would be far from the weightiest factor.

I mentioned an important factor in that last paragraph, one that the committee report does not mention: bar passage rates. States have very different bar passage rates, ranging from 68.23% in Louisiana to 93.08% in South Dakota. (Both of those links will take you to Robert Anderson’s excellent analysis of bar exam difficulty. For purposes of this discussion, look at the far right-hand column, which gives actual pass rates.) When talking about employment outcomes, I suspect that differences in bar passage rates are far more important than differences in the timing of bar results. Waiting for bar results can slow down job offers, but flunking the bar hurts a lot more. People who fail the bar, in fact, may lose jobs they had already lined up.

California has the second lowest pass rate in the nation, behind only Louisiana (a state that is distinctive in many ways). Even graduates of ABA-accredited schools in California have a relatively low pass rate (76.9% for first-timers in July 2012) compared to exam-takers in other states. I suspect that much of the “California effect” detected by the ABA committee stems from the state’s high bar failure rate rather than its late reporting of bar results. Bar passage rates alone won’t fully explain differences in employment outcomes; I would perform a full regression analysis if I wanted to explore the factors related to those outcomes. But consider the relative impact of late results and poor results: Graduates who find out in November that they passed the bar may be a few weeks behind graduates in other states when seeking jobs. But graduates who find out in November that they failed the July bar have a whole different problem. Those graduates won’t be working on February 15, because they’ll be studying for the February bar.

California schools and graduates may face a bar timing problem, but they face a much larger bar passage problem. If we’re concerned with leveling the playing field for law schools, that’s a pretty rough terrain to tackle. As I suggest further below, moreover, the Data Committee shouldn’t worry about leveling the field for inter-school competition; after all, the ABA and its Section of Legal Education explicitly repudiate rankings. The committee should focus on the important task of gathering thoughtful data that informs accreditation and protects the public (including potential law students).

How Much Would the Date Shift Help?

Even if California (and maybe NY) schools have a problem related to the timing of bar results, how much would the proposed remedy help? Not very much. As Kyle pointed out yesterday, the date shift will give every school’s graduates an extra month to obtain full-time, long-term employment. Employment rates will go up for all schools, but will any difference between NY/California schools and other schools diminish? The committee actually could address that question with existing data, because there are several states that release bar results considerably earlier than other states. Do schools in those “early release” states have an employment advantage over other schools during October and November? If so, when does the advantage dissipate? A more refined regression analysis could suggest how much difference the proposed change would actually make.

I am relatively confident, meanwhile, that shifting the employment measurement date to March 15 would not address the bar-passage discrepancy I discuss above. The February bar exam occurs during the last week of the month. If low employment rates for California schools stem partly from a disproportionate number of graduates taking the February exam, a March 15 employment date doesn’t help much. Two weeks, give or take a day or two, isn’t much time to recover from the exam, apply for jobs, persuade an employer that you probably passed the exam you just took, and start work.

Negatives

What about downsides to the committee’s proposal? Kyle ably articulated four substantial ones yesterday. First, prospective students will receive employment information a month later, and this is a month that matters. Many law schools require seat deposits by May 1, and admitted students are actively weighing offers throughout April. Providing employment data in late April, rather than by March 31 (the current standard), leaves students waiting too long for important information. We should be striving to give prospective students information earlier in the spring, not later.

In fact, the committee’s report contains a helpful suggestion on this score: It indicates that law schools could submit March 15 employment data by April 7. If that’s true, then schools should be able to submit February 15 data by March 7–allowing the ABA to publish employment information a full week earlier than it currently does. Again, that’s a key week for students considering law school acceptances.

Second, the nine-month measurement day is already three months later than the day that would make most sense to prospective students and graduates. The grace period for repayment of direct unsubsidized loans ends six months after graduation; deferral of repayment for PLUS loans ends at the same time. For prospective students, a very important question is: What are the chances that I’ll have a full-time job when I have to start repaying my loans? We don’t currently answer that question for students. Instead, we tell them how many graduates of each law school have full-time jobs (and other types of jobs) three months after they’ve had to start repaying loans. If we’re going to change the reporting date for employment outcomes, we should move to six months–not ten. Schools could complement the six-month information with nine-month, ten-month, one-year, two-year, or any other measures. Employment rates at six months, however, would be most meaningful to prospective law students.

Third, changing the measurement date impedes comparisons over time. Partly for that reason, I haven't advocated for a change to the six-month measure–although if change is on the table, I will definitely advocate for the six-month frame. The employment rates collected by the ABA allow comparison over time, as well as among schools. If schools begin reporting 10-month employment rates for the class of 2013, that class's employment rate almost certainly will be higher than the class of 2012's nine-month rate. But will the increase be due to improvements in the job market or to the shift in measurement date? If we want to understand changes in the job market, and that understanding is as important for schools as it is for students and graduates, there's a strong reason to keep the measurement date constant.

Finally, changing to a ten-month measurement date will make law schools–and their accrediting body–look bad. The committee’s report shows a great concern for “the particular hardship on law schools located in late bar results states,” the “current penalty on law schools who suffer from late bar results,” and the need for “a more level playing field” among those schools. There’s scant mention of the graduates who actually take these exams, wait for the results, search for jobs, remain unemployed nine months after graduation, and begin repaying loans before that date.

Prospective students, current students, graduates, and other law school critics will notice that focus–they really will. Why do law schools suddenly need to report employment outcomes after 10 months rather than nine? Is it because the information will be more timely for prospective students? Or because the information will be more accurate? No, it’s because some schools are suffering a hardship.

The Data Committee and Council need to pay more attention to the needs of students. From the student perspective, the "hardship" or "penalty" that some schools suffer is actually one that their graduates endure. If it takes longer to get a full-time lawyering job in New York or California than in North Carolina or New Mexico, that's a distinction that matters to the graduates, not just the schools. It's the graduates who will be carrying very large loans, with ever-accumulating interest, for that extra month or two.

Similarly, if the real “penalty” stems from bar passage rates, that’s a difference that matters a lot to prospective students. It’s harder to pass the bar exam in California than in forty-eight other states and the District of Columbia. If you can’t pass the bar on your first try, your chances of working as a lawyer nine months after graduation fall significantly. Those are facts that affect graduates in the first instance, not law schools. They’re facts that prospective students need to know, not ones that we should in any way smooth over by creating a “level playing field” in which all graduates eventually obtain jobs.

Striking the Balance

The committee’s case for the proposed change is weak: the cited data don’t support the recommendation, the method of analyzing the data is simplistic, and the report doesn’t discuss costs of the proposal. Worse, law students and graduates will read the report’s reasoning as focused on the reputation of law schools, rather than as concerned about providing helpful, timely information to the people who we hope will work in our legal system. The committee could improve its analyses and reasoning, but the better move would be to reject the proposal and focus on more important matters.


Proposed Employment Data Change

June 5th, 2013 / By

On Friday, the ABA Section of Legal Education considers a recommendation from the section’s data policy committee about when schools collect graduate employment data. Instead of collecting data nine months after graduation, schools would collect data ten months after graduation.

The change looks minor, but it's misguided. The council should dismiss the recommendation outright for the reasons outlined below; at a minimum, it should decline to act on the recommendation this week.

The Committee’s Justification

The committee’s reasoning is straightforward: some graduates don’t obtain jobs by the nine-month mark because some state bars have a slow licensing process. As committee chair Len Strickman puts it in the committee’s recommendation memo, the data policy change would have “the benefit of a more level playing field.”

Several New York and California deans have lobbied for the policy change because those jurisdictions release July bar results so late. Last year, California provided results on November 16th, with swearing-in ceremonies in the following weeks. New York provided results earlier, on November 1st, but many graduates still waited months to be sworn in.

A variety of employers, such as small firms and state government, tend to hire licensed graduates. Compared to schools in states with a quicker credentialing process, New York and California schools are disadvantaged on current employment metrics. Changing the measurement date to mid-March instead of mid-February would allegedly take some of the bite out of that advantage.

To check for a quantifiable advantage, the data policy committee considered two sets of data. First, the committee sorted schools by the percentage of 2012 graduates working in professional jobs (as lawyers or otherwise) as of February 15, 2013. Second, the committee sorted schools by the percentage of 2012 graduates who were unemployed or had an unknown employment status. For both measures, the committee determined that New York and California schools were disproportionately represented on the bad end of the curve.

Poorly Supported Justification

Professor Strickman notes in his committee memo that many of the poorly-performing schools "are broadly considered to be highly competitive schools nationally." I'm not sure exactly what this means, but it sounds a lot like confirmation bias. Is he suggesting that the employment outcomes don't match U.S. News rankings? The committee's collective impression of how well the schools should perform? Faculty reputation? It's a mystery, and without further support, not at all compelling.

Professor Strickman acknowledges that other factors may explain the relative placement. He does not name or address them. Here are some factors that may explain the so-called disadvantage:

(1) Graduate surplus (not just 2012, but for years);
(2) Attractiveness of certain states to graduates from out-of-state schools;
(3) Overall health of local legal markets;
(4) Graduate desirability;
(5) Ability of schools to fund post-graduation jobs.

Nor do we know whether the rule revision would actually level the playing field. In other words, one extra month may not capture more professional job outcomes for graduates of New York and California schools than for graduates of other schools. More time, after all, ought to produce better results for all schools with high under- and unemployment.

In sum, the committee should have declined to recommend the ten-month proposal until its proponents meet their burden of persuasion. The problem has not been well articulated, and the data do not support the conclusion.

The Accreditor’s Role

Worse than recommending an unsupported policy change, the committee ignores the group for whom law schools produce job statistics: prospective students. Prospective students, current students, and a society that depends on lawyers are the Section of Legal Education's constituents. Calling the uneven playing field a "disadvantage," "penalty," and "hardship" for law schools shows where the committee's perspective comes from.

(1) Is there a normative problem with an uneven playing field?

It’s not apparent that there’s an issue to resolve. Grant the committee its premise that state credentialing timelines affect performance on employment metrics. Is it the ABA’s job to ensure that schools compete with each other on a level playing field?

In one sense, yes, of course. When a school lies, cheats, or deceives, it gains an undeserved advantage, and ABA Standard 509 prohibits this behavior. But the standard does not prohibit that behavior because of how it affects school-on-school competition. The prohibitions are a consequence of the ABA's role in protecting consumers and the public.

The ABA was ahead of the curve when it adopted Standard 509 in the 1990s. The organization interpreted its accreditation role to include communicating non-educational value to those constituents through employment information.

Here, the ABA failed to adequately consider the prospective students who want to make informed decisions, and the public which subsidizes legal education.

Prospective students received only a passing mention in Professor Strickman's memo. In describing why the committee rejected several schools' request to move the measurement date to one year, Professor Strickman explains:

The Data Policy and Collection Committee decided to reject this request because that length of delay would undermine the currency of data available to prospective law students.

As it happens, the committee's chosen proposal has a currency problem of its own. The committee also failed to convey whether, or how, it considered the change's impact on the value of the consumer information.

(2) Does the new policy impede a prospective student’s ability to make informed decisions?

One of the ABA’s recent accomplishments was accelerating the publication of employment data. Previously, the ABA published new employment data 16 months after schools measured employment outcomes. In 2013, the ABA took only six weeks.

But if the Section of Legal Education adopts the ten-month proposal, it pushes data publication to the end of April—after many deposit deadlines and on the eve of others. While applicants should not overrate the importance of year-to-year differences, they should have the opportunity to evaluate the changes.

The new policy also makes the information less useful.

At one time, schools reported graduate employment outcomes as of six months after graduation. In 1996, NALP began measuring outcomes at nine months instead. The ABA, which at that time only asked schools to report their NALP employment rate, followed.

The six-month measurement makes far more sense than the nine-month date. Six months after graduating, interest accumulated during school capitalizes and the first loan payment is due. Ideally that six-month period would be used to pay down the accumulated interest so that less interest is paid later. The credentialing process makes this a rarity. Adding another month to the measurement makes the figure even less valuable.

Reducing comparability also dilutes the value of recent employment information. Students should not consider one year of data in isolation, but should analyze changes and the reasons for those changes. It’s for this reason that the ABA requires schools to publish three years of employment data as of last August.

Conclusion: Dismiss or Wait

The council needs to add additional viewpoints to the data policy committee. Right now, the committee is dominated by law school faculty and administrators. All twelve members are current faculty, deans, or other administrators. The name change from the “Questionnaire Committee” to the “Data Policy and Collection Committee” envisions a policy role for the group.

Just as the council, standards committee, and accreditation committee need a diversity of viewpoints, so too does the data policy committee. Perhaps if this diversity had existed on the committee to begin with, the new measurement date would not have been recommended prematurely, or at all.

As the council considers whose interests it serves and whether the data policy recommendation is ripe for adoption, I hope its members also consider the drivers of the policy beyond a law school lobby promoting its own interests.

The policy presupposes a reality where there are enough graduates who cannot derive economic value from their law degrees nine months after graduating that the ABA needs to modify its collection policy in order to count them.

Let me repeat that. It takes so long to become a lawyer that almost a year can pass and it’s reasonable to think many people are not yet using a credential they invested over three years of time, money, and effort to receive. A career is (hopefully) decades long, but the brutal reality of credentialing is that its costs go beyond what any fair system would contemplate. A change to the data policy as a solution would be funny were the economics of legal education not so tragic.


The Federal Government’s Massive and Declining Investment in Legal Education

May 22nd, 2013 / By

Nowadays, law students borrow from the Department of Education Direct Loan Program to pay for school. These loans are income-generating assets for the government, so I thought it would be interesting to see how large an investment the federal government presently makes in law schools.

Based on my calculations, the total annual federal investment in law schools through student loans is currently $4.88 billion (2012-13 school year). Last year (2011-12), that number was $4.95 billion.

Calculating the annual investment required a sequence of estimates along with hard data. First, I used school-supplied data to calculate the government’s investment in 2012 graduates of ABA-approved schools. Second, I incorporated ABA-supplied enrollment data to estimate the government’s investment in all students enrolled during 2011-12. Finally, I used enrollment figures and tuition estimates to extend that projection to 2012-13, the academic year that just closed.

Students, of course, use their loans to cover living costs as well as law school tuition and fees. Law students, however, are forbidden from working during their first year, and find limited opportunities for paid work during the second and third years. Law schools can recruit students only as long as the students have a way to pay for both tuition and living expenses. It’s appropriate, therefore, to speak of educational loans to law students as an investment in law schools, not just students.

1. $4.43 Billion Federal Investment in 2012 Graduates of ABA-Approved Schools

Hard Data on 192 ABA-Approved Schools: $4.33 Billion. Students who graduated law school in 2012 borrowed at least $4.33 billion in federally-guaranteed and federal direct student loans to finance their legal education: that’s the amount of federal loan dollars processed and disbursed by 192 law schools to their 2012 graduates who borrowed for law school.

To calculate the amount loaned for each school (available in the table here), I took the number of graduates and multiplied it by the percentage of those graduates borrowing loans processed by the school. I rounded that number to the nearest whole graduate and multiplied it by the average amount borrowed at that school. The known federal government investment figures do not include students who never graduated and those enrolled in non-JD programs.
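The per-school arithmetic can be sketched in a few lines. The school in the example is hypothetical, not one of the 192 reporting schools:

```python
# Sketch of the per-school calculation described above.
# The inputs are illustrative placeholders, not data for any real school.

def federal_investment(grads, pct_borrowing, avg_borrowed):
    """Estimated federal loan dollars disbursed to one school's class."""
    borrowers = round(grads * pct_borrowing)  # round to nearest whole graduate
    return borrowers * avg_borrowed

# Hypothetical school: 250 graduates, 84% borrowing, $110,000 average debt
print(federal_investment(250, 0.84, 110_000))  # 210 borrowers -> 23100000
```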

Here is a table that aggregates federal investment by school type:

Type                   Schools Accounted For   Avg. Debt/Student (% of all grads borrowing)   Total Federal Investment
Private (Non-Profit)   110/113 (97.3%)         $125,963 (84.2%)                               $3,064,183,905
Public                 77/81 (95.1%)           $89,078 (83.8%)                                $1,110,978,434
Private (For Profit)   5/5 (100%)              $138,149 (91.7%)                               $150,167,940
All Types              192/199 (96.5%)         $114,170 (84.3%)                               $4,325,330,279

The following schools did not report sufficient borrowing data to U.S. News: Barry University (Private, 200 grads), Florida A&M University (Public, 160 grads), Indiana University – Indianapolis (Public, 295 grads), Inter American University (Private, 234 grads), Pontifical Catholic University of Puerto Rico (Private, 217 grads), University of Puerto Rico (Public, 202 grads), University of The District of Columbia (Public, 93 grads).

Estimated Investment in 2012 Graduates of Seven Other ABA-Approved Law Schools: $107 million. The seven ABA-approved schools (immediately above) had 1,401 graduates in 2012, but did not provide sufficient data about student borrowing. Three were non-profit private schools (with 651 grads); four were public schools (with 750 grads).

To estimate total federal investment in these graduates I used the average amount borrowed and average percentage borrowing by school type. The result is 548 graduates of the private schools borrowing an average of $125,963 and 629 graduates of the public schools borrowing an average of $89,078, or $125.2 million total. Because the three schools in Puerto Rico are on average much cheaper than their U.S. counterparts, I also discounted the amount borrowed 25% for the public Puerto Rican school and 30% for the private ones. This reduced the total for these seven schools to about $107 million.
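As a check on the $107 million figure, here is one reconstruction of the estimate. The allocation is my own assumption: the 25% and 30% discounts apply only to graduates of the three Puerto Rico schools, while the other four schools use the undiscounted averages by school type.

```python
# Reconstructing the ~$107 million estimate for the seven non-reporting schools.
# Assumption (mine): the discounts apply only to the Puerto Rico graduates.
AVG_PRIVATE, PCT_PRIVATE = 125_963, 0.842
AVG_PUBLIC, PCT_PUBLIC = 89_078, 0.838

private_pr = 234 + 217         # Inter American, Pontifical Catholic
private_other = 200            # Barry
public_pr = 202                # University of Puerto Rico
public_other = 160 + 295 + 93  # Florida A&M, Indiana-Indianapolis, UDC

total = (round(private_other * PCT_PRIVATE) * AVG_PRIVATE
         + round(private_pr * PCT_PRIVATE) * AVG_PRIVATE * 0.70   # 30% discount
         + round(public_other * PCT_PUBLIC) * AVG_PUBLIC
         + round(public_pr * PCT_PUBLIC) * AVG_PUBLIC * 0.75)     # 25% discount
print(round(total / 1e6, 1))  # roughly 107 (in millions of dollars)
```

This lands within a million dollars of the $107 million used above; small differences reflect rounding choices.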

Adding that total to the $4.33 billion discussed above yields a grand total of $4.43 billion that the Department of Education invested in students who earned JDs at ABA-accredited law schools in 2012.

2. 2011-2012 Federal Investment in All Enrolled JD Students: $4.95 Billion

Estimating the federal government’s annual investment in all enrolled students, rather than just graduates, required some arithmetic gymnastics. Here are the calculations for 2011-12, the most recent year for which we have information about borrowing:

The 46,360 graduates in 2012 (with 84.3% borrowing) borrowed $4.43 billion, but that borrowing was over a period of three years during which tuition and cost of living rose steadily. In other words, the $4.43 billion estimate is for students who were first years in 2009-10, second years in 2010-11, and third years in 2011-12. (These numbers account for part-time and joint-degree students by assuming that, overall, their enrollment was steady from year to year.)

I next determined how much the 2012 graduates borrowed just for 2011-12. From the time those graduates entered law school, tuition rose on average about 7% each year. Under that assumption, 2012 graduates borrowed 31.2% of their total during the first year, 33.4% during the second year, and 35.4% during the third year. Multiply 35.4% by the total federal investment in 2012 graduates of ABA-approved schools ($4.433 billion) and the result is $1.569 billion for 2012 graduates during their last year—or an average of $40,146 for each of the students who borrowed.
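A quick sketch of the split, assuming the shares are simply geometric weights under 7% annual growth (the 31.2%/33.4%/35.4% figures used above are close to, though not exactly, these weights, presumably because of additional rounding):

```python
# Three-year borrowing shares under ~7% annual tuition growth.
g = 0.07
weights = [1, 1 + g, (1 + g) ** 2]
shares = [w / sum(weights) for w in weights]
print([round(s, 3) for s in shares])  # [0.311, 0.333, 0.356]

# Final-year borrowing by 2012 graduates, using the 35.4% share from the text:
final_year = 0.354 * 4.433e9
print(round(final_year / 1e9, 3))  # 1.569 (billions)
```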

Assuming that 1Ls and 2Ls followed the same borrowing patterns as the students who graduated, we could estimate the federal government’s annual investment in JD students simply by multiplying $1.569 billion (the amount loaned to 2012 graduates) by three. That yields a total of $4.71 billion. That initial estimate is low, however, because it doesn’t account for attrition. The graduating class is smaller than 1L and 2L classes.

To get a more accurate estimate of the federal investment in all JD students enrolled during 2011-12, I took the ABA-reported total JD enrollment for 2011-12 (146,288) and deducted the number of graduates (46,360). That left 99,928 students who attended JD programs in 2011-12 but did not graduate that year. If those students borrowed in the same percentages as graduating students did, then 84,239 (84.3%) of them took federal loans. Multiplying that amount times the average amount borrowed ($40,146) yields $3.382 billion. The total amount invested in all JD students enrolled during the 2011-12 school year, therefore, was about $4.95 billion.
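The steps above can be combined into a short calculation, using the figures stated in the text:

```python
# Federal investment in all JD students enrolled during 2011-12.
grads_final_year = 1.569e9        # loans to 2012 grads for their last year
avg_per_borrower = 40_146

total_jd = 146_288                # ABA-reported 2011-12 JD enrollment
graduates = 46_360
non_grads = total_jd - graduates  # 99,928 continuing students
borrowers = round(non_grads * 0.843)  # 84,239, same borrowing rate assumed

non_grad_loans = borrowers * avg_per_borrower  # ~$3.382 billion
total_2011_12 = grads_final_year + non_grad_loans
print(round(total_2011_12 / 1e9, 2))  # 4.95 (billions)
```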

Note the assumption that the average price paid did not vary by class year. Note, too, that my calculation does not include students at schools not approved by the ABA but nevertheless eligible for Title IV student loans. Nor, finally, did I include students eligible for federal funds who enrolled in LLM or other non-JD programs administered by law schools.

3. 2012-2013 Federal Investment in All Enrolled JD Students: $4.88 Billion

The estimate for 2012-13 faced several additional hurdles. The 2011-12 estimate must be adjusted for tuition rises (which increase the average amount borrowed), changes in total enrollment (which declined substantially), and the percentage of all students borrowing (which I assumed was steady at 84.3%).

In 2010, 2011, and 2012, law schools enrolled new classes of 52,488, 48,697, and 44,518 students. Based on prior graduation, enrollment, and attrition data, I estimated that 47,000 students graduated in 2013. We know that 44,518 were in their first year, so with total enrollment at 139,362 students, about 47,844 students were in their second year.

I next estimated how much these students borrowed in 2012-13. The 2012 graduate had borrowed an average of $40,146 for the last year of law school. If we assume that this amount rose due to tuition increases by an extremely modest 5% for the 94,844 upper-level students (with 84.3% borrowing), the federal investment was $3.37 billion for those students. However, the first-year students (in the aggregate, at least) did not feel the brunt of the tuition increases. Tuition discounts, financed through the upper-level students, were needed to sway prospective students. I assumed that students who began school in fall 2012 borrowed no more for their first year than the 2012 graduates borrowed for their last year. Using that assumption, I estimated that the federal investment in the 44,518 first-year students (with 84.3% borrowing) was $1.507 billion.
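Under the assumptions just described, the 2012-13 total can be reproduced as:

```python
# Federal investment in JD students enrolled during 2012-13.
avg_2011_12 = 40_146   # final-year borrowing per 2012 graduate who borrowed
pct = 0.843            # share of students borrowing, assumed constant

upper_level = 47_000 + 47_844  # estimated 3Ls plus 2Ls = 94,844
first_years = 44_518

upper = round(upper_level * pct) * avg_2011_12 * 1.05  # 5% tuition rise, ~$3.37B
first = round(first_years * pct) * avg_2011_12         # flat borrowing, ~$1.51B
total = upper + first
print(round(total / 1e9, 2))  # 4.88 (billions)
```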

That brings total federal loans for JD students to an estimated $4.88 billion for 2012-13. That’s a substantial investment, but note that it’s $70 million less than the federal investment in 2011-12. JD tuition revenue declined significantly during the last academic year.

4. Bonus: 2013-14 Federal Investment Speculation

In 2011 and 2012, law schools enrolled new classes of 48,697 students and 44,481 students. For the coming fall, the most common projection is just 38,000. Based on prior graduation, enrollment data, and past attrition data, I estimate that 43,800 students will graduate in 2014. Using the projection of 38,000 first-year students, I estimate total enrollment at 125,300 students, which would be the lowest since 2000.

What will those students pay for law school, and how much will they borrow? Schools are competing to maintain first-year enrollments, so I predict that incoming students will borrow no more than the ones who just finished their first year (an average of $40,146). If 84.3% of the class continues to borrow from the federal government, then these incoming 1Ls will borrow a total of $1.29 billion. If we assume that the 87,300 upper-level students borrow 5% more than they did in the current year, and continue borrowing in the same proportions, those students will borrow about $3.18 billion. The estimated total federal investment in JD students during 2013-14 is $4.47 billion. That’s a lot of cash, but it’s $410 million less than the estimate for 2012-13.

Note that this estimate doesn't include any changes in borrowing for living expenses–other than the reduction in the number of students. If inflation increases the cost of living, or if students have more difficulty finding paid part-time employment, total borrowing may be somewhat higher than this estimate. On the other hand, if students reduce living costs, borrowing may be even lower than my projection. The biggest story here, however, is the reduction in the number of enrolled students combined with the modulation of tuition.

Putting all of the numbers together, I estimate that the federal government invested $4.95 billion in JD students enrolled in ABA-approved law schools during 2011-12; that it invested $4.88 billion in those students during 2012-13; and that it will invest $4.47 billion in 2013-14.

Conclusion

The calculations grow hazier as we move from hard data to estimates, but they are good ballpark figures for the amounts that law students borrowed from the federal government during the past two years, as well as for the amounts they are likely to borrow during the coming year. Two conclusions immediately stick out to me.

First, the federal investment in legal education is substantial. Compared to the $112 billion federal investment in all of higher education in FY2012, law schools are disproportionately funded. As the conversation heats up about law school economics and student loans, and about whether the federal government considers such an investment justified or fair, these estimates convey the magnitude of the federal government's commitment.

Second, law schools have a lot less money to spend, and the shortfall will only grow this coming year. My estimates for 2012-13 and 2013-14 suggest that fewer students are enrolling and that they are paying less tuition. The largest law school class ever enrolled just graduated, and it will be replaced by the smallest class in roughly 40 years. To enroll the upcoming class, schools will also likely offer larger discounts than ever before—discounts that have been growing very quickly. My projections suggest that law students will borrow $480 million less from the federal government during 2013-14 than in 2011-12. That's a loss of almost half a billion dollars caused by lower enrollment and heavily discounted tuition. Information can do wonders, even in a dysfunctional market.

Schools may make up for some lost revenue through non-JD programs, which continue to expand quickly and largely unregulated. Others will have to cut costs. Most law schools will survive, but they have difficult decisions ahead.


Is BigLaw Reviving?

April 29th, 2013 / By

The American Lawyer has published the Am Law 100, its annual list of America’s highest revenue generating law firms. The accompanying article, titled “Spring Awakening,” suggests that BigLaw may have turned the tide to better times. The subtitle, in fact, states: “The Am Law 100’s modest gains hint that a fundamental recovery is taking root.”

BigLaw may be stabilizing, but the numbers don’t suggest any recovery in hiring levels for new associates. Revenue increases are modest, and headcount rose a miserly 0.8% over the last year. A detailed analysis of firm billing, meanwhile, declares that “[t]he most endangered species in The Am Law 100 appears to be the junior associate.”

Revenues

Gross revenue among the Am Law 100 increased 3.4% in 2012. Inflation, however, was 1.7%, which accounts for half of the increase. Average revenue per lawyer, meanwhile, increased just 2.6%, not much faster than inflation.

Profits per partner, notably, rose more than any other indicator. Those profits increased an average of 4.2%. Those increases make partners happy but, when partners take more than their share of gross revenue, there’s less money for hiring or compensating new lawyers.

Junior Associates

The day after publishing its Am Law 100 list, The American Lawyer released details of a study underscoring the decline of junior lawyers at those firms. The study analyzed a sample of bills submitted to BigLaw clients over the last three years. In 2010, those bills included hours billed by 3,322 junior lawyers (those with less than three years of experience). In 2012, the number of timekeepers in that category was just 2,327–a thirty percent decline.

The report notes that BigLaw clients have resisted paying for junior lawyers’ time, but finds that fact an unlikely explanation for a decline of this size. An associate would appear in this study even if the firm billed just a single hour of her time. It’s unlikely that any firm wrote off an entire year of work by a junior associate. The junior associates who did appear in the billing records, moreover, billed more time than midlevel or senior associates.

Instead, the most likely explanation for the decline is that firms have not been replacing entry-level lawyers. They are shifting work to staff attorneys, contract attorneys, and outsourcing firms. Or, when they hire new associates, they may be seeking ones who already have three years of experience.

Client Demand

The analysis of billing records reveals another grim fact: The large clients represented in the study hired BigLaw firms for fewer hours in 2012 than in 2011. In 2011 these clients bought 4.4 million hours from the studied firms; in 2012, they purchased just 4.3 million. That’s still significantly higher than the 3.7 million hours billed in 2010, but a dip between 2011 and 2012 does not bode well–especially for law students seeking associate positions at these firms. BigLaw clients may be handling more work in-house, solving problems through other means, hiring smaller firms, and turning directly to legal process outsourcers.

Conclusion

Most BigLaw firms are still vibrant, handling large amounts of work, and increasing profits for their partners. Business may have stabilized somewhat since the worst days of the recession. For law schools and new lawyers, however, any BigLaw recovery seems modest at best. At worst, in the words of The American Lawyer‘s columnist, junior associates are an “endangered species” in BigLaw.


Straight Talk About JD Advantage Jobs

April 28th, 2013 / By

Earlier this month, I expressed my concern about NALP‘s aggressive marketing of JD Advantage jobs to pre-law students. Last week NALP posted additional information about these jobs on its website. Although some of the data are interesting, NALP is still withholding key information it possesses about JD Advantage jobs: law graduates are much less satisfied with these jobs than with ones that require bar admission.

The omission is both regrettable and deceptive. NALP has published much of the data it collects on JD Advantage jobs, while ignoring some of the most negative–and relevant–information in its possession. This biased disclosure reflects poorly on NALP, but it also embarrasses us as legal educators and professionals. NALP is a membership organization composed of law schools and legal employers, so it speaks for us. The last thing that law schools need, after years of bad press about distorted job statistics, is publication of more misleading data.

As educators, we care about both our graduates’ welfare and the accuracy of data. NALP’s dissembling with respect to JD Advantage jobs raises real questions about whether it is capable of continuing to represent our interests. Perhaps it is time for law schools to create a different organization–or work solely with the ABA–to collect and publish unbiased data about the careers of law school graduates. We need that information, not only to advise prospective and current students, but to guide our own decisions about how to reshape legal education. Feel-good presentations that omit key facts will not help us confront the ongoing challenges to our schools and profession.

I summarize here some of the information we currently have about JD Advantage jobs, including the data omitted by NALP. I also suggest ways that we could begin collecting more objective data about these jobs. If JD Advantage jobs are going to play an important role in the future of legal education, we have to get serious about examining these positions.

Looking Back

There have always been law graduates who pursued careers outside of law practice. The information we have, however, suggests that most of those graduates embraced alternative careers after practicing law for at least a few years. An earlier statement by NALP, for example, acknowledges: “It is certainly true that people with JD degrees work in a wide variety of alternative careers. However, while that may be true down the road, lawyers most often choose a non-traditional path after practicing law for at least a few years.” (This statement still appears on the NALP website, but it is not connected to the pages promoting JD Advantage jobs as entry-level positions.)

This distinction is important. Graduates who take JD Advantage jobs after practicing law differ from those who seek these jobs immediately after law school. The historical record suggests that some employers value the JD plus law practice experience for certain jobs; the record tells us very little about the value of the JD alone for those career paths. When we advise current and prospective students about the value of JD Advantage jobs, we have to be careful to distinguish graduates who used their degrees plus practice from those who attempted to secure JD Advantage jobs immediately after law school. A graduate who takes a job in “compliance” right after graduation has a very different job from one who moves in-house to do compliance work after three years in a regulatory law practice. Their long-term career trajectories may also differ; we have little available information on that score.

Earlier graduates in non-traditional positions offer an important resource for gathering information about JD Advantage jobs and, if those jobs seem promising, developing career paths for current graduates. We have to seek that information, however, in a serious way. It’s not enough simply to talk with these graduates at reunions. We need to map law-related opportunities more systematically, seek feedback on which law school experiences are particularly valuable for those jobs, and analyze objectively how much a JD contributes to graduates obtaining those positions and advancing in them.

JD Advantage Today

As entry-level jobs in law practice have contracted and shifted to less attractive positions, law graduates have looked to alternative fields. NALP’s Detailed Analysis of JD Advantage Jobs shows how important those jobs have become. Among 2011 graduates who reported their job status, 12.5% took JD Advantage jobs. That represents one out of every eight graduates. As a percentage of all graduates, including those who did not report their job status, graduates in JD Advantage positions accounted for 11.7% of the class.

According to recently released ABA figures, the percentage went up for the class of 2012. Among those who reported their employment status, 13.2% held JD Advantage jobs. As a percentage of the full graduating class, these jobs accounted for 12.9% of graduates.
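The two percentages quoted for each class (share of reporting graduates vs. share of the full class) also let us back out roughly what fraction of each class reported an employment status at all. A minimal sketch of that arithmetic, using only the figures above (the published percentages are rounded, so the results are approximate):

```python
# If X% of *reporting* graduates and Y% of *all* graduates hold
# JD Advantage jobs, then the share of the class whose status is
# known is Y / X. Percentages are those quoted in the text.

def implied_response_rate(pct_of_reporting: float, pct_of_all: float) -> float:
    """Fraction of the class whose employment status is known."""
    return pct_of_all / pct_of_reporting

class_2011 = implied_response_rate(12.5, 11.7)
class_2012 = implied_response_rate(13.2, 12.9)

print(f"Implied 2011 response rate: {class_2011:.1%}")  # ~93.6%
print(f"Implied 2012 response rate: {class_2012:.1%}")  # ~97.7%
```

The narrowing gap between the two percentages for 2012 reflects the higher response rate under ABA collection, not a change in the jobs themselves.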

Those percentages are substantially higher than the rates reported during the century’s first decade. For the class of 2001, 5.9% of graduates reporting their employment status indicated that they held “JD Preferred” jobs; that category was the precursor to the contemporary “JD Advantage” one. For the class of 2004, the figure was 7.5%, and in 2007, it was also 7.5%. The percentage edged up to 7.8% for the class of 2008, then began jumping noticeably each year: to 8.8% for the class of 2009, 10.2% for the class of 2010, 12.5% for the class of 2011, and 13.2% for the class of 2012.

This pattern in itself suggests that law graduates are turning to JD Advantage jobs as a “Plan B” when they cannot find jobs in law practice. Interest in these jobs has not been “growing steadily” since 2001, as NALP suggests in its recent analysis. Instead, interest jumped significantly after the recession hit the legal market in 2009. We need to look seriously at graduates’ satisfaction with JD Advantage jobs. Do recent graduates hope to build a career in this work? Or are they using JD Advantage jobs as place-holders while looking for work in law practice? If the latter, how well can graduates make that transition?

Job Satisfaction

NALP already has data on some of these questions. As part of its annual survey of law graduates, NALP asks employed graduates whether they are “seeking a job other than the one” reported to their Career Services Office. The answers to this question shed important light on a graduate’s job satisfaction. Graduates answer this survey within nine months of law school graduation. If they are seeking another job that quickly after graduation, the reported job either lacks permanence or holds little appeal.

Responses to this question consistently suggest that law graduates prefer jobs that require bar admission over JD Advantage ones. In 2001, just 6.7% of graduates working in lawyering jobs (those that required a law license) were looking for other work; a full third (33.3%) of those with JD Preferred jobs were actively seeking another job. In 2004, the percentages were 8.5% (for those in jobs requiring bar admission) and 37.0% (for JD Preferred jobs). Three years later, in 2007, the percentages were virtually identical to the 2004 ones: 8.7% of graduates with lawyering jobs were seeking other work, while 37.7% of those with JD Preferred positions were on the job market.

NALP’s latest figures, from 2011, show the same pattern. With a tighter market and more ad hoc jobs, the percentages have risen in both categories. 16.5% of graduates with lawyering jobs were seeking other work, and 46.8% of those with JD Advantage jobs were doing so. For graduates with other types of professional employment, the percentage was even higher: more than half (52.1%) of those graduates were sufficiently dissatisfied with their jobs to be seeking a different one.

These figures further suggest that JD Advantage positions are fallback jobs, rather than affirmative career decisions, for many graduates. Some graduates may eagerly pursue jobs in this category, but a large number do not. Almost half are seeking other work as soon as they begin these positions. Even among JD Advantage workers who have temporarily withdrawn from the job market, at least some may hope to move into law practice eventually.

This is essential information to know about the job market, but you won’t find the data on NALP’s web page offering a “Detailed Analysis of JD Advantage Jobs.” A prospective law student or interested law professor would have to purchase NALP’s $90 book on Jobs and JDs to find that information. The student or professor, of course, would also have to know that the additional data exist.

We need to grapple with negative information about JD Advantage jobs, not selectively ignore those data. Which graduates are satisfied with JD Advantage jobs and why? What work are the other graduates doing? Will that work help them secure jobs that better fulfill their career ambitions?

Toward Better Data

As noted above, I’m not sure that NALP is the best organization to collect more data on JD Advantage jobs or other evolving facets of the job market. The organization’s recent treatment of JD Advantage jobs suggests that it is spinning data rather than providing objective information. The ABA might serve as a better resource for ongoing career information. That professional group is providing data more quickly than NALP, and it is publishing the data in both summary and detailed form. Law School Transparency is also offering rapidly updated, objective career information through its Score Reports.

Whatever organizations we work with in the future, here are some questions that we need to address about JD Advantage jobs:

1. What are these jobs? Both NALP and the ABA allow graduates and their schools to decide whether a job qualifies for this category. It is very easy for a JD graduate or a JD-granting institution to conclude that their degree confers a “demonstrable advantage in obtaining or performing” a particular job. These decisions, however, may overstate the value of the JD. Is a job as a substitute middle school teacher a “JD Advantage” one? What about a job as a police officer? Law graduates in these jobs probably would draw upon their legal training, but are these the type of jobs we envision as “JD Advantage” ones?

There’s no reason to debate these questions in the abstract. We should simply require schools to list the jobs they have counted as “JD Advantage” ones. The ABA could publish that information, both for individual schools and in the aggregate. Some students may find positions as middle school teachers or police officers attractive; others may decide that the JD is not the best route to those positions. By publishing the data, we can inform both students and ourselves about possible career paths for law graduates.

2. How many students take different types of JD Advantage jobs? Law schools count paralegal positions as “JD Advantage” ones, but they rarely tout those jobs. Instead, websites tend to refer to policy analysts and investment bankers. Following the previous suggestion would allow us to advise students (and ourselves) about the prevalence of graduates in these very different JD Advantage positions.

3. How do other degrees and experiences contribute to graduates’ success in pursuing JD Advantage positions? A JD offers an advantage for some accounting positions, but it is very unlikely that a law graduate could obtain an accounting job without also holding a degree in accounting. Similarly, some JDs in business hold an MBA along with the JD. To give our students good counsel, as well as to enhance our own understanding of legal education, we need to collect more granular data about the relationship of JD Advantage jobs to other degrees. This research might suggest that other degrees shoulder much of the weight in securing some “JD Advantage” positions. Alternatively, it might identify particular joint degrees as especially useful for law students. The research might also suggest that we could benefit our students by incorporating elements of other degree programs in the JD curriculum.

4. How do law graduates fare in fields dominated by graduates with college or master’s degrees? According to the Department of Labor, only 20% of arbitrators, mediators, and conciliators hold a professional or doctoral degree; both BA and MA degrees are more common in this field. The Department does not even mention the JD as an educational prerequisite for a Human Resources Manager; 73% of those workers have just an associate’s or bachelor’s degree, while 27% possess a master’s degree. What do we mean, then, when we say that the JD provides an advantage for these positions? Do law graduates enter these fields at higher levels of responsibility than graduates with other preparation? Do they advance further? Based on anecdotal information, my sense is that the answer to both of these questions is “no.” The JD plus practice experience gives graduates an advantage in these fields, but the JD alone may not. But that’s just an impression; we need hard data on this issue.

Answering questions like these will help us advise prospective and current law students. Equally important, this information will inform our own decisions about the future of legal education. Is a three-year degree necessary for these JD Advantage jobs? Would a one- or two-year degree serve equally well? What elements of legal education contribute to these jobs? Is it critical thinking skills? Knowledge of legal doctrine? Both? How large are the contributions? We have to be willing to ask these questions as researchers and to interpret the answers objectively. Armed with that information, we can make responsible and productive decisions about how to improve the value of legal education.


Update on Applications

April 14th, 2013 / By

LSAC has posted its count of law school applicants through Friday, April 5. The number of applicants had reached 52,066 by then, which was 15.9% less than the number at the same time last year. Applications have fallen more than applicants; the application total was 20.0% less than last year. The current application season is drawing to an end: By April 5 of 2012, law schools had received 96% of their applications from 91% of all applicants.

The numbers prompt these observations:

1. If current trends hold, we will finish the season with about 57,215 applicants, 15.9% less than last year’s total. That’s a stark decline, although not quite as steep as numbers suggested earlier in the season.

2. Law schools admitted 55,800 students just two years ago, when they had 78,500 applicants to choose from. If we admit the same number of students this year, almost every applicant will receive an offer.

3. The sharper decline in applications, compared to applicants, is noteworthy. It suggests to me that this year’s applicants are pickier than those in previous years; they are applying to fewer schools. Will that choosiness persist? If it does, schools may see lower yields on their offers than in previous years. That could depress class sizes more than the 15.9% drop in applicants suggests.
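The projection in observation 1 follows directly from the figures above: if this year's application timing matches last year's, when 91% of eventual applicants had applied by April 5, the final count is simply the current count divided by 0.91. A quick sketch of that arithmetic:

```python
# Project the season's final applicant count from the April 5 tally,
# assuming this year's timing matches last year's (91% of eventual
# applicants had applied by the same date in 2012).

applicants_so_far = 52_066
share_of_final_last_year = 0.91

projected_final = applicants_so_far / share_of_final_last_year
print(f"Projected final applicant count: {projected_final:,.0f}")  # ~57,215
```

The projection is only as good as the timing assumption; if this year's applicants are applying later (or earlier) in the cycle than last year's, the final count will differ.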


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.

