
Deliberations for a prospective law student

August 21st, 2013

This piece was published in this month’s National Jurist PreLaw Magazine.

If you’re thinking about applying to law school, you likely have two main questions: Can I get in? And should I go? Lately, the answers have shifted for many individuals. A higher percentage of applicants are getting into law school; fewer are deciding to attend.

Getting into law school remains an achievement. It signals intelligence and determination. But before deciding to attend, you need to examine your options and aspirations carefully. Don’t choose law school simply because you got in, you like to argue, you’ve always wanted to help people, or you don’t know what else to do. Think more specifically about what you hope to achieve with a law degree.

You can find that clarity through more than your imagination. Law school is a professional school, so go see what the professionals do. Shadow a variety of lawyers, meet dozens more and do anything you can to peer into the career paths of those who came before you.

Once you’ve developed some ideas about the type of career you want, check out employment outcomes for specific schools. The American Bar Association has a website with recent employment information for every accredited law school. Nationwide, only 56 percent of 2012 law graduates found full-time, long-term jobs requiring bar admission within nine months of graduation. But employment rates and job patterns vary across schools, so look carefully.

Geography matters. In recent years, about two-thirds of employed law graduates obtained their first job in the state in which their school was located. Think not just about where you want to attend school, but where you want to build a career. My nonprofit organization, Law School Transparency, developed a tool that will help you find the schools that send graduates to the cities or states where you want to work, www.LSTScoreReports.com. Employment data from the ABA and other sources are available on LST school profiles too.

Understand your student loans. The federal government originates almost all law school loans these days. Research how these loans work, such as what amount you’ll have to pay and when you will have to begin repayment. Also look at the tax implications. Boston University and Georgetown law schools have developed user-friendly calculators to help you compute debt loads and repayment plans. Consider what life will be like with debt, from the impact it may have on your career choices to your family planning or psychological well-being.

Finally, ignore all U.S. News & World Report law school rankings. At best, these rankings serve as proxies for various traits, all of which are better measured by analyzing the raw data available from LST, the ABA, or the ABA-LSAC Official Guide.

Helping You Deliberate

Even with all of the sources listed above, it’s hard to make good decisions about investing in law school. When you look at salary statistics or other information on a law school website, how can you be sure they’ve presented the data fairly?

Law School Transparency recently announced a law school certification program that builds on the resources provided by the ABA and individual schools. The program centers on assuring fair representations about financial education and job statistics. We’ll certify the inaugural group of law schools in the coming months. Certified law schools will partner with LST to help students make educated decisions about whether and where to begin their legal career.

As participants, these schools will use our certification mark to signal their compliance with best practices for publishing vital employment and financial information. “LST Certified” will also demonstrate the school’s commitment to enrolling an informed student body.

Two primary goals inspire our program. First, we want you to have access to the information you need to critically evaluate your life-changing decision. The ABA’s law school accreditation standards require some important information, but not all that you want to know. The new LST Best Practices fill the gaps.

Second, we want students like you to trust the law schools that deserve your trust. Prospective students typically struggle to distinguish between schools that claim a comprehensive picture of job outcomes and costs, and those schools that actually provide one. These days, all law schools put their best foot forward to convey their value and distinctive offerings. But fierce competition drives questionable marketing tactics.

Law school is a huge investment that requires you to balance complicated costs, potential job and educational outcomes, and intangible benefits. ABA data, law school websites, the LST website, and LST’s new certification can all help you make the best decision. None of these sources can tell you whether to attend law school — or which school to attend — but they will aid your decision-making.


The Protectionism Premium

August 18th, 2013

Brian Sheppard of Seton Hall Law School has raised an interesting point about any financial premium associated with the JD: How much of that premium rests on the legal profession’s restrictions on entry? At the end of his post, Sheppard suggests that an “empirical study of the effects of various protectionist measures would be a worthwhile” endeavor.

I know of one such analysis, conducted by Mario Pagliero. Pagliero, an economics professor at the University of Turin, explored the relationship between bar exam difficulty and entry-level salaries for U.S. lawyers. His study takes advantage of the fact that most states administer a common set of bar questions (the Multistate Bar Exam, or MBE) while establishing very different passing scores. States have also changed their passing scores over the last few decades, offering a robust dataset of passing scores that vary by state and time.

Pagliero compares these passing scores with entry-level salaries reported by NALP for corresponding states and times. In this way, he explores whether exam difficulty (as measured by passing score) bears any relationship to entry-level salary.

In an initial paper, Pagliero reports a clear relationship between the two: more difficult bar exams correlate with higher entry-level salaries. He also finds that lower pass rates correspond with higher salaries, suggesting that the first correlation relates to supply rather than quality. More difficult bar exams, in other words, reduce the supply of lawyers. Reduced supply, in turn, raises entry-level salaries. By Pagliero’s calculation, a 1% increase in bar exam difficulty corresponds with a 1.7% increase in starting salaries.
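To make that magnitude concrete, here is a minimal sketch of how an elasticity of 1.7 translates a change in exam difficulty into a change in starting salary. The base salary below is invented for illustration; only the 1.7 figure comes from Pagliero’s estimate.

```python
# Hypothetical illustration of Pagliero's estimate: a 1% increase in bar
# exam difficulty corresponds with a ~1.7% increase in starting salaries.

def predicted_salary(base_salary, difficulty_change_pct, elasticity=1.7):
    """Scale a starting salary by the estimated difficulty elasticity."""
    return base_salary * (1 + elasticity * difficulty_change_pct / 100)

# A 1% harder exam applied to a hypothetical $60,000 starting salary:
print(round(predicted_salary(60_000, 1.0)))  # 61020
```

This is just the point estimate applied mechanically; Pagliero’s papers, of course, estimate the relationship from variation in passing scores across states and over time.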

In a second paper, Pagliero uses the same data to examine a frequently debated policy question: Do licensing standards help consumers by reducing information asymmetries (in other words, by providing information about quality that consumers cannot readily obtain on their own)? Or do the standards primarily serve the profession, by restricting entry and raising salaries? Based on the data and his modeling, Pagliero concludes that, for the U.S. legal profession, “licensing, as implemented, increases salaries and decreases the availability of lawyers, thus significantly reducing consumer welfare.” (p. 481)

This isn’t terribly surprising. The legal profession enjoys significant barriers to entry: applicants must master (and pay for) four years of college, three years of law school, and the bar exam. Increasingly, they must also devote time to low-paid or volunteer apprenticeships. These substantial barriers reduce competition, allowing lawyers to charge a premium for their services.

How can this premium persist in the face of under- and unemployed lawyers? It is difficult for many of those lawyers to compete against established firms. Ethics rules prohibit lawyers from using outside investments to build a practice; lawyers may take loans, but may not share profits with nonlawyers. New lawyers also lack access to adequate supervision unless they obtain jobs with existing firms. Law graduates who fail to obtain jobs at prevailing wages may seek work in other fields rather than attempting to undercut fees.

If at least some lawyer income stems from protectionism, that raises at least two important questions. First, is protectionism good policy? Pagliero’s model suggests that US courts protect lawyers, rather than the public, by limiting access to the profession. Based on these results, other authors have called for deregulation of the profession. As individuals struggle to obtain affordable legal services, and courts flounder in a growing sea of pro se litigants, calls to deregulate the profession will continue.

Second, even without formal deregulation, the work reserved exclusively for licensed lawyers is shrinking. Companies like Legalzoom and RocketLawyer, which provide low-cost legal documents to individuals and small businesses, are flourishing. Software like WillMaker is even cheaper. My husband and I produced a full set of wills, powers of attorney, and living wills for less than $25.

These services cannibalize the work available to lawyers serving individuals and small businesses. Corporate firms, meanwhile, are losing business to in-house compliance officers, HR officials, and others who administer complex regulations for their companies. Within BigLaw, US-licensed lawyers have lost jobs to software and overseas attorneys.

The biggest threat to lawyers’ historic livelihood comes, not from technology or globalization alone, but from the way in which those forces encroach upon work that once belonged exclusively to lawyers. Lawyers, like other workers, are seeing some of their jobs lost to computers or overseas workers. For us, however, the loss may be greater than for those in unregulated fields. To the extent our incomes depended partly on a protectionism premium, we may lose a significant part of that premium as consumers find ways to address legal needs without direct representation by lawyers.


10-9 for Nine to Ten

August 10th, 2013

By a narrow vote of 10-9, the ABA’s Legal Education Council has approved a proposal to move back the reporting date for new-graduate employment, from nine months after graduation to ten months after earning a degree. Kyle and I have each written about this proposal, and we each submitted comments opposing the change. The decision, I think, tells prospective students and the public two things.

First, the date change loudly signals that the entry-level job market remains very difficult for recent graduates, and that law schools anticipate those challenges continuing for the foreseeable future. This was the rationale for the proposal: that large firms are hiring “far fewer entry level graduates,” that “there is a distinct tendency of judges” to seek experienced clerks, and that other employers are reluctant to hire graduates until they have been admitted to the bar.

The schools saw these forces as ones that were unfairly, and perhaps unevenly, affecting their employment rates; they wanted to make clear that their educational programs were as sound as ever. From a prospective student’s viewpoint, however, the source of job-market changes doesn’t matter. An expensive degree that leads to heavy debt, ten months of unemployment, and the need to purchase still more tutoring for the bar is not an attractive degree. Students know that the long-term pay-off, in job satisfaction or compensation, may be high for some graduates. But this is an uncertain time in both the general economy and the regulation of law practice; early-career prospects matter to prospective students with choices.

Second, and more disappointing to me, the Council’s vote suggests a concern with the comparative status of law schools, rather than with the very real changes occurring in the profession. The ABA’s Task Force on the Future of Legal Education has just issued a working paper that calls upon law faculty to “reduce the role given to status as a measure of personal and institutional success.” That’s a hard goal to reach without leadership from the top.

Given widespread acknowledgement that the proposal to shift the reporting date stemmed from changes in the US News methodology, we aren’t getting that leadership. Nor are we getting leadership on giving students the information they need, when they need it. This is another black eye for legal education.


Financial Returns to Legal Education

July 21st, 2013

I was busy with several projects this week, so didn’t have a chance to comment on the new paper by Michael Simkovic and Frank McIntyre. With the luxury of weekend time, I have some praise, some caveats, and some criticism for the paper.

First, in the praise category, this is a useful contribution to both the literature and the policy debates surrounding the value of a law degree. Simkovic and McIntyre are not the first to analyze the financial rewards of law school–or to examine other aspects of the market for law-related services–but their paper adds to this growing body of work.

Second, Simkovic and McIntyre have done all of us a great service by drawing attention to the Survey of Income and Program Participation. This is a rich dataset that can inform many explorations, including other studies related to legal education. The survey, for example, includes questions about grants, loans, and other assistance used to finance higher education. (See pp. 307-08 of this outline.) I hope to find time to work with this dataset, and I hope others will as well.

Now I move to some caveats and criticisms.

Sixteen Years Is Not the Long Term

Simkovic and McIntyre frequently refer to their results as representing “long-term” outcomes or “historic norms.” A central claim of the study, for example, is that the earnings premium from a law degree “is stable over the long term, with short term cyclical fluctuations.” (See slide 26 of the PowerPoint overview.) These representations, however, rest on a “term” of just sixteen years, from 1996 to 2011. Sixteen years is less than half the span of a typical law graduate’s career; it is too short a period to embody long-term trends.

This is a different caveat from the one that Simkovic and McIntyre express, that we can’t know whether contemporary changes in the legal market will disrupt the trends they’ve identified. We can’t, in other words, know that the period from 2012-2027 will look like the one from 1996-2011. Equally important, however, the study doesn’t tell us anything about the years before 1996. Did the period from 1980-1995 look like the one from 1996-2011? What about the period from 1964-1979? Or 1948-1963?

The SIPP data can’t tell us about those periods. The survey began during the 1980s, but the instrument changed substantially in 1996. Nor do other surveys, to my knowledge, give us the type of information we need to perform those historical analyses. Simkovic and McIntyre didn’t overlook relevant data, but they claim too much from the data they do have.

Note that SIPP does contain data about law graduates of all ages. This is one of the strengths of the database, and of the Simkovic/McIntyre analysis. This study shows us the earnings of law graduates who have been practicing for decades, not just those of recent graduates. That analysis, however, occurs entirely within the sixteen-year window of 1996-2011. Putting aside other flaws or caveats for now, Simkovic and McIntyre are able to describe the earnings premium for law graduates of all ages during that sixteen-year window. They can say, as they do, that the premium has fluctuated within a particular band over that period. That statement, however, is very different from saying that the premium has been stable over the “long term” or that this period sets “historic norms.” To measure the long term, we’d want to know about a longer period of time.

This matters, because saying something has been “stable over the long term” sounds very reassuring. Sixteen years, however, is less than half the span of a typical law graduate’s career. It’s less, even, than the time that many graduates will devote to repaying their law school loans. The widely touted Pay As You Earn program extends payments over twenty years, while other plans structure payments over twenty-five years. Simkovic and McIntyre’s references to the “long term” suggest a stability that their sixteen years of data can’t support.

What would a graph of truly long-term trends show? We can’t know for sure without better data. The data might show the same pattern that Simkovic and McIntyre found for recent years. On the other hand, historic data might reveal periods when the economic premium from a law degree was small or declining. A study of long-term trends might also identify times when the JD premium was rising or higher than the one identified by Simkovic and McIntyre. A lot has changed in higher education, legal education, and the legal profession over the last 25, 50, or 100 years. That past may or may not inform the future, but it’s important to recognize that Simkovic and McIntyre tell us only about the recent past–a period that most recognize as particularly prosperous for lawyers–not about the long term.

Structural Shifts

Simkovic and McIntyre discount predictions that the legal market is undergoing a structural shift that will change lawyer earnings, the JD earnings premium, or other aspects of the labor market. Their skepticism does not stem from examination of particular workplace trends; instead it rests largely on the data they compiled. This is where Simkovic and McIntyre’s claim of stability “over the long term” becomes most dangerous.

On pp. 36-37, for example, Simkovic and McIntyre list a number of technological changes that have affected law practice, from “introduction of the typewriter” to “computerized and modular legal research through Lexis and Westlaw; word processing; electronic citation software; electronic document storage and filing systems; automated document comparison; electronic document search; email; photocopying; desktop publishing; standardized legal forms; will-making and tax-preparing software.” They then conclude (on p. 37) that “[t]hrough it all, the law degree has continued to offer a large earnings premium.”

That’s clearly hyperbole: We have no idea, based on the Simkovic and McIntyre analysis, how most of these technological changes affected the value of a law degree. Today’s JD, based on a three-year curriculum, didn’t exist when the typewriter debuted. Lexis, Westlaw, and word processing have been around since the 1970s; photocopying dates back further than that. A study of earnings between 1996 and 2011 can’t tell us much about how those innovations affected the earnings of law graduates.

It is true (again, assuming for now no other flaws in the analysis) that legal education delivered an earnings premium during the period 1996-2011, which occurred after all of these technologies had entered the workforce. Neither typewriters nor word processors destroyed the earnings that law graduates, on average, enjoyed during those sixteen years. That is different, however, from saying that these technologies had no structural effect on lawyers’ earnings.

The Tale of the Typewriter

The lowly typewriter, in fact, may have contributed to a major structural shift in the legal market: the creation of three-year law schools and formal schooling requirements for bar admission. Simkovic and McIntyre (at fn 84) quote a 1901 statement that sounds like a melodramatic indictment of the typewriter’s impact on law practice. Francis Miles Finch, the Dean of Cornell Law School and President of the New York State Bar Association, told the bar association in 1901 that “current conditions are widely and radically different from those existing fifty years ago . . . the student in the law office copies nothing and sees nothing. The stenographer and the typewriter have monopolized what was his work . . . and he sits outside of the business tide.”

Finch, however, was not wringing his hands over new technology or the imminent demise of the legal profession; he was pointing out that law office apprentices no longer had the opportunity to absorb legal principles by copying the pleadings, briefs, letters, and other work of practicing lawyers. Finch used this change in office practices to support his argument for new licensing requirements: He proposed that every lawyer should finish four years of high school, as well as three years of law school or four years of apprenticeship, before qualifying to take the bar. These were novel requirements at the turn of the last century, although a movement was building in that direction. After Finch’s speech, the NY bar association unanimously endorsed his proposal.

Did the typewriter single-handedly lead to the creation of three-year law schools and academic prerequisites for the bar examination? Of course not. But the changing conditions of apprentice work, which grew partly from changes in technology, contributed to that shift. This structural shift, in turn, almost certainly affected the earnings of aspiring lawyers.

Some would-be lawyers, especially those of limited economic means, may not have been able to delay paid employment long enough to satisfy the requirements. Those aspirants wouldn’t have become lawyers, losing whatever financial advantage the profession might have conferred. Those who complied with the new requirements, meanwhile, lost several years of earning potential. If they attended law school, they also transferred some of their future earnings to the school by paying tuition. In these ways, the requirements reduced earnings for potential lawyers.

On the other hand, by raising barriers to entry, the requirements may have increased earnings for those already in the profession–as well as for those who succeeded in joining. Finch explicitly noted in his speech that “the profession is becoming overcrowded” and it would be a “benefit” if the educational requirements reduced the number of lawyers. (P. 102.)

The structural change, in other words, probably created winners and losers. It may also have widened the gap between those two groups. It is difficult, more than a century later, to trace the full financial effects of the educational requirements that our profession adopted during the first third of the twentieth century. I would not, however, be as quick as Simkovic and McIntyre to dismiss structural changes or their complex economic impacts.

Summary

I’ve outlined here both my praise for Simkovic and McIntyre’s article and my first two criticisms. The article adds to a much-needed literature on the economics of legal education and the legal profession; it also highlights a particularly rich dataset for other scholars to explore. On the other hand, the article claims too much by referring to long-term trends and historic norms; it examines labor market returns for law school graduates during a relatively short (and perhaps distinctive) recent period of sixteen years. The article also dismisses too quickly the impact of structural shifts. That is not really Simkovic and McIntyre’s focus, as they concede. Their data, however, do not provide the type of long-term record that would refute the possibility of structural shifts.

My next post related to this article will pick up where I left off, with winners and losers. My policy concerns with legal education and the legal profession focus primarily on the distribution of earnings, rather than on the profession’s potential to remain profitable overall. Why did law school tuition climb aggressively from 1996 through 2011, if the earnings premium was stable during that period? Why, in other words, do law schools reap a greater share of the premium today than they did in earlier decades?

Which students, meanwhile, don’t attend law school at all, forgoing any share in law school’s possible premium? For those who do attend, how is that premium distributed? Are those patterns shifting? I’ll explore these questions of winners and losers, including what we can learn about the issues from Simkovic and McIntyre, in a future post.

Crucial Weaknesses

July 19th, 2013

Clearly, Simkovic and McIntyre’s article has given new life to those who would defend the status quo. However, even assuming the statistical methodology is sound (which I do, as I have no reason to believe otherwise and no time to recreate it), the study suffers from a number of crucial weaknesses.

First, Part IV makes the assumption that current market challenges reflect no more than the historically cyclical nature of the legal market. If you do not agree with this assumption (and I do not–I think Susskind’s view on this issue is far more sound), then the entire study is fundamentally flawed. However, even if you buy this assumption, there remain further issues with the study.

The title itself, the “Million-Dollar Law Degree,” is misleading at best. This million-dollar figure reflects the mean value, where the mean is skewed significantly higher than the median. Thus, it overstates the value for significantly more than half of all JD grads. It also reflects “pre-tax” value, a point that the authors do not address until near the end of the article, at Part V.C. There, the authors acknowledge that their calculated benefit must be divided between private “after-tax” earnings and public tax revenues.


New Study on Economic Value of Law Degree

July 17th, 2013

I won’t spend much time summarizing the new paper by Michael Simkovic, an associate law professor at Seton Hall University School of Law, and Frank McIntyre, an assistant professor of finance and economics at Rutgers University Business School. Inside Higher Ed summarized the report just fine.

Instead, I want to comment on what I see as a misguided attempt to quell critics claiming that the law school investment is not a sound choice for many people. I hope Professor Simkovic and Professor McIntyre are correct that, on average and even down to the 25th percentile, the law school investment makes financial sense.

But the paper completely misses the point and grossly under-appreciates the human element.

The authors reason that, rather than viewing law degree holders in isolation, we can get better estimates of the causal effect of education by comparing the earnings of individuals with law degrees to the earnings of similar individuals with bachelor’s degrees, while being mindful of the statistical effects of selection into law school.

Unfortunately, law degree holders are individuals who are sometimes (perhaps often) hurt by going to law school. Talking about groups necessarily smooths over the stories underneath the data: the ones that make you feel good and the ones that make you sick to your stomach. The reality is that many people have been hurt, and are hurt right now, as a direct consequence of the costs associated with entering the legal profession (or trying to). These graduates very well may make more money in the long run. But this is hardly comforting to those considering law school and those who care about the people who do.

As I told Inside Higher Ed, law schools have made a habit out of capturing as much value out of their students as possible—and for a long time, used deceptive and immoral marketing tactics to do so. The dynamics are changing and should change because of the outrageously high price of obtaining a legal education. Even if an analysis shows an investment has a positive net present value in the long run, people are not like corporations. The short-term matters more for real people. Tens of thousands of law graduates leave school each year wondering how they’re going to manage to pay off their six-figure loans. That’s what motivates critics and frightens prospective law students.

Long-term value is not irrelevant, but using it to argue that education isn’t priced too high troubles me. If we think our society and our country are better for having an educated population, as these two authors do, then we had better stop pricing people out of education.


New Salary Data: Arkansas Law Schools

July 15th, 2013

I wrote last week about a group of states that are using a “linked-records” method to collect detailed salary information for graduates of higher education. The method has some flaws, but it is improving rapidly. The databases, meanwhile, already contain information about graduates of fifteen law schools spread over five states. Let’s take a look, starting alphabetically with Arkansas.

Arkansas has two ABA-accredited law schools: the University of Arkansas at Fayetteville and the University of Arkansas at Little Rock. Both schools place a substantial majority of their graduates with employers in Arkansas, making them excellent candidates for the linked-records system. For the class of 2012, according to ABA data, 81 of Fayetteville’s 119 employed graduates (68.1%) took their first jobs in Arkansas. For the Little Rock campus, the figure was 85.3% (93 out of 109 employed graduates).
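As a quick check, the in-state percentages quoted above follow directly from the ABA counts. This trivial sketch recomputes them; the counts themselves come from the post:

```python
# Recompute the in-state placement rates from the ABA counts quoted above
# (81 of 119 employed Fayetteville grads; 93 of 109 Little Rock grads).

def in_state_rate(in_state_jobs, employed_grads):
    """Percentage of employed graduates whose first job was in-state."""
    return round(100 * in_state_jobs / employed_grads, 1)

print(in_state_rate(81, 119))  # Fayetteville: 68.1
print(in_state_rate(93, 109))  # Little Rock: 85.3
```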

Average Salaries in Law

What salaries did those graduates earn? The College Measures database doesn’t have figures yet for 2012 graduates–or even for 2011 ones in Arkansas. But it does report the average first-year earnings of graduates from the classes of 2006 through 2010 who stayed in-state to work. For the Fayetteville campus, the average was $45,745, and for the Little Rock campus it was $47,060.

Those averages come with all of the caveats I mentioned in my earlier post: They exclude graduates working out of state, graduates holding federal jobs, and self-employed graduates. Perhaps most important, those averages include the legal market’s boom years of 2006 through 2009, along with just one down year. When the database incorporates salaries for the classes of 2011 through 2013, the averages may be lower.

Comparisons with Other Programs

Even including those boom years, however, the salaries of Arkansas law graduates suffer in comparison to starting salaries in other advanced degree programs. The Little Rock campus collected sufficient salary data from three different PhD programs: higher education administration, educational leadership, and physical sciences. The average starting salary in each of those programs was higher than in law, ranging from $52,726 in physical sciences to $72,134 in educational leadership.

To be fair, doctoral candidates in educational leadership or higher education administration often have significant workplace experience; they’re less likely than law students to move directly from college to graduate school. The salaries for these PhDs, therefore, may partly reflect their workplace experience, not just the value of the degree. Still, eight of Little Rock’s undergraduate programs produced higher starting salaries than its law school did, topping out at $65,978 for registered nurses.

The story is similar at the Fayetteville main campus. There, five of seven doctoral programs produced higher starting salaries than law, and a sixth came within $500 of law. I was surprised to see that the starting salaries of Arkansas law graduates compare unfavorably with those of graduates holding doctorates in adult and continuing education (average starting salary of $58,013), educational leadership ($85,245), and public policy analysis ($68,425). Even a master’s degree in political science produced an average starting salary ($44,202) within shouting distance of a law salary.

Equally depressing comparisons come from the University of Arkansas’s medical sciences campus. Dental hygienists with just an associate’s degree averaged higher starting salaries ($49,644) than law graduates from either Arkansas campus. A master’s in public health garnered, on average, $56,074. And doctors of pharmacy out-earned almost everyone with an average starting salary of $104,977.

Some of these careers, of course, may reach salary plateaus; it’s possible that Arkansas’s law graduates will earn more as their experience mounts. Even at the entry level, an Arkansas law degree continues to produce higher earnings than most undergraduate degrees. College graduates from the Fayetteville campus averaged just $33,956 during their first year in the workforce.

NALP Data

How do the linked-records salaries compare to ones reported to NALP? I couldn’t find salary information on either Arkansas law school’s website, but NALP’s Jobs and JDs book, available in hard cover, offers some interesting data. In 2007, law graduates working full-time in Arkansas reported an average salary of $49,966. That’s higher than the rolling averages compiled through the linked-records method, but not too far off. (Note that the NALP figures refer to all law graduates working in Arkansas, while the linked-records data include all Arkansas law graduates working in Arkansas. The salary pools, however, should be comparable.)

For 2011, on the other hand, NALP’s reported salaries seem quite high for Arkansas jobs. The reported mean is $52,806–more than six thousand dollars higher than the linked-records average for the boom years. It’s possible that the highest paying legal jobs in Arkansas are going to graduates of out-of-state schools. But it’s also quite likely, as NALP and law schools acknowledge, that the NALP-reported salaries skew high. That’s a good reason to support continued development of other methods for tracking salaries.

Below Minimum Wage

The last piece of information from the Arkansas linked-records database is particularly interesting. When calculating average salaries, Arkansas excluded any graduates who earned less than $13,195 per year, which is the state’s minimum wage threshold. Most employees earning less than that threshold are part-time or temporary workers. Including those salaries in a calculation of average full-time earnings would unfairly depress the average, so the researchers excluded these “below minimum wage” workers from the calculations.

Arkansas, however, does report the number of these “below minimum wage” workers for each degree program. Those numbers are depressingly high for the two law schools. Fifty-two of Little Rock’s graduates, 8.4% of all students who graduated between 2006 and 2010, earned less than $13,195 for the year that started six months after their graduation date. The percentage was the same for the Fayetteville campus: fifty-five graduates, or 8.4% of those who graduated between 2006 and 2010, earned less than minimum wage once they entered the workforce. That’s one in every twelve law graduates.

A few of these graduates may have worked in Arkansas for a few months and then moved to another state; that would produce a small amount of earnings in the Arkansas database. Others may have worked part-time for employers to supplement a solo practice or freelance work. The one in twelve figure, on the other hand, doesn’t include graduates who subsisted entirely on freelance wages or who found no paying work at all; those graduates don’t appear at all in the linked-records database.
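The exclusion rule itself is simple arithmetic: drop anyone whose annual earnings fall below the threshold before averaging, but track how many were dropped. Here is a minimal sketch, with invented earnings figures (only the $13,195 threshold comes from the Arkansas data):

```python
# Arkansas's rule: exclude annual earnings below the $13,195 threshold
# before averaging, but count how many graduates were excluded.
THRESHOLD = 13195

# Invented annual earnings for a small cohort
earnings = [48000, 52000, 9500, 61000, 4200, 45000]

full_time = [e for e in earnings if e >= THRESHOLD]
excluded = [e for e in earnings if e < THRESHOLD]

average = sum(full_time) / len(full_time)
share_excluded = len(excluded) / len(earnings)

print(f"average salary (below-threshold workers dropped): ${average:,.0f}")
print(f"share earning below the threshold: {share_excluded:.1%}")
```

The reported average rises because the presumed part-timers are dropped, which is why the separate count of "below minimum wage" workers matters so much.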

Observations

What do we make of these data? The linked-records databases, like other sources of employment information, are incomplete. It is particularly difficult to distinguish unemployed graduates from those who have moved to other states–or to determine salary levels for the latter group of graduates. If researchers ultimately link databases across states, those connections would greatly improve the available information.

This brief examination of Arkansas data, meanwhile, illustrates the kind of comparisons facilitated by linked-records databases. Starting salaries for law graduates exceed those for most (although not all) college majors, but they lag behind salaries for many other advanced-degree holders. As we continue to debate reforms in legal education, we have to remember the options available to prospective students. Starting salaries are an important element in that calculus, one that students will be able to track more easily with databases like the ones available through College Measures.


New Salary Data

July 7th, 2013 / By

Law school critics have pressed schools to produce better information about the salaries earned by their graduates. Existing sources, as we know, provide incomplete or biased information. The Bureau of Labor Statistics (BLS) gathers data about lawyers’ salaries, but those reports omit solo practitioners, law firm partners, and law graduates who don’t practice law. Nor can we break down the BLS data to identify earnings by new lawyers or by graduates of particular schools.

The salary information gathered by NALP, in contrast, focuses on new graduates, includes graduates in non-practice jobs, and can be tied to particular schools (if a school chooses to publish their data). But these figures suffer from significant selection bias; NALP warns that these salaries “are biased upwards.”

Better salary information, however, is on the way. Researchers in other fields have found a new way to gather salary data about graduates of degree programs. The method hinges on the fact that employers pay unemployment taxes for each individual they employ. These taxes fund the pools used to support unemployment compensation. The government wants to make sure that it gathers its fair share of taxes, so employers report the wages they pay each individual. State unemployment compensation agencies, therefore, possess databanks of social security numbers linked to wages.

Educational institutions, similarly, possess the social security numbers of their graduates. It is possible, therefore, to use SSNs to link graduates with their salaries. The researchers doing this, of course, don’t examine the salaries of individual graduates. Instead, this “linked-records” approach allows them to generate aggregate salary data about graduates by college, major, year of degree, and several other criteria. The method also allows researchers to track salaries over time, both to see how entry-level salaries change and to track income as graduates gain workplace experience. For a brief overview of the method, see this paper from Berkeley’s Center for Studies in Higher Education.
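In rough outline, the linked-records join looks like the sketch below. The schools, SSNs, and salaries are invented for illustration, and real researchers work with secured state databases rather than plain dictionaries; the point is only the shape of the computation: match on SSN, then report aggregates, never individuals.

```python
from collections import defaultdict

# State wage records: SSN -> reported annual wages. SSNs missing here
# belong to graduates who are unemployed or working out of state.
wage_records = {
    "111-11-1111": 41000,
    "222-22-2222": 55000,
    "333-33-3333": 38000,
}

# University records: one entry per graduate (all invented)
graduates = [
    {"ssn": "111-11-1111", "school": "State U", "major": "Law"},
    {"ssn": "222-22-2222", "school": "State U", "major": "Law"},
    {"ssn": "444-44-4444", "school": "State U", "major": "Law"},  # no match
]

def aggregate_salaries(graduates, wage_records):
    """Average salary and match count per (school, major) -- aggregates
    only, so no individual's wages are ever exposed."""
    totals = defaultdict(lambda: [0, 0])  # key -> [wage sum, count]
    for g in graduates:
        wage = wage_records.get(g["ssn"])
        if wage is not None:
            key = (g["school"], g["major"])
            totals[key][0] += wage
            totals[key][1] += 1
    return {k: (s / n, n) for k, (s, n) in totals.items()}

result = aggregate_salaries(graduates, wage_records)
print(result)  # {('State U', 'Law'): (48000.0, 2)}
```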

The linked-record approach has the potential to generate very nuanced information about the financial pay-off of different educational programs. Salary information, in fact, is already available for several law schools. Before we get to that, however, let’s look more closely at the method’s wider application and its current limits.

Applications

California has used this research method to generate an extensive database of salary outcomes for graduates of its community college programs. Using the online “salary surfer,” you can discover that the highest earning graduates from those programs are individuals who earn a certificate in electrical systems and power transmission. Those graduates average $93,410 two years after certification and $123,174 five years out.

If you’re not willing to climb utility poles or hang out with high voltage wires, a plumbing certificate also pays off reasonably well in California, generating an average salary of $65,080 two years after graduation. That certificate, however, doesn’t seem to add more value with time–at least not during the early years of a career. Average salary for certified plumbers rises to just $65,299 five years after graduation.

Community college degrees in health-related work also generate substantial salaries. Degrees in the humanities, fine and applied arts, cosmetology, and travel services, on the other hand, are poor bets financially. Paralegal training falls in the middle: A paralegal degree from a California school yields an average salary of $38,191 two years after graduation and $42,332 five years out. Paralegal certificates, notably, generate higher wages. Those paralegals average $41,546 two years after certification and $47,674 after five years. I suspect that premium occurs because the certificate earners already hold four-year college degrees; they combine the paralegal certificate with a BA to earn more in the workplace.

You can spend hours with the California database, exploring the many subjects that community colleges teach and the varied financial pay-offs for those degrees. Let’s move on, however, to a much broader database.

The research organization College Measures is working with several states to identify salary outcomes for all types of post-secondary degrees. This database, like the one for California community colleges, relies upon the linked-records data collection method described above. The College Measures site currently includes schools in Arkansas, Colorado, Tennessee, Texas, and Virginia–with Florida and Nevada coming soon. The database doesn’t include every school or degree program in these states, but coverage is growing. Here are just a few findings to illustrate the detail available on the site:

* Chicken farming is a staple of the Arkansas economy, and the University of Arkansas’s main campus offers a BA in poultry science. Those degree holders average $37,251 during their first year after college–a little more than accounting BA’s from the same campus can expect to earn ($36,681).

* Arkansas, however, teaches much more than poultry science and accounting. Some of the highest earning graduates major in chemical engineering ($56,655), physics ($48,820), computer engineering ($45,589), and economics ($43,739). If you want to maximize income after graduation, on the other hand, stay away from majors in audiology ($20,417), classics ($20,842), and drama ($22,629).

* Moving to the Texas portion of the site, you won’t be surprised to discover that the most remunerative BA offered by the University of Texas at Austin is in Petroleum Engineering. Those graduates average $115,777 during their first year out of school.

* The least financially rewarding BA’s from the UT-Austin campus, at least initially, are general music performance ($11,098), Arabic Language and Literature ($17,192), and General Visual and Performing Arts ($17,749).

You can find similar results for other majors and schools in these states, as well as for schools in Colorado, Tennessee, and Virginia. Before continuing, however, let’s examine several key limits on the currently available data.

Limits

1. One State at a Time. The linked-records databases currently operate only within a single state: they can only identify salaries for graduates who work in the same state where they attended school. The Colorado database, for example, includes both of the state’s ABA-accredited law schools–but it reports only salaries for graduates who worked in Colorado the year after graduation.

This constraint will understate salaries for law schools that send a large number of graduates to other states for high-paying jobs. If Connecticut creates a database, for example, Yale Law School will receive no credit for the salaries of graduates who work in Massachusetts, New York, the District of Columbia, and other states. The University of Texas’s law school, currently included in the College Measures database, receives credit for salaries earned at BigLaw firms in Dallas or Houston–but not for those earned in Chicago, Los Angeles, or New York.

Researchers are working to overcome this limit by linking databases nationally. I suspect that will happen within the next year or two, making the linked-records method much more comprehensive. Meanwhile, the “one state” limit casts doubt on salary results for schools with a large number of graduates who leave the state.

For many law schools, however, even single-state salary reports can yield useful information. Most law schools place the majority of their graduates in entry-level jobs within the same state. All of the Texas law schools place more than half of their graduates with Texas employers. The same is true for the Arkansas law schools, Colorado schools, and two of the three Tennessee schools. Among the states for which linked-records data are currently available, only the Virginia law schools send a majority of their graduates out of state.

For law schools that place a majority of their graduates in-state, the linked-record databases provide a welcome perspective on a wide range of salaries. These databases include jobs with small law firms, local government, and small businesses. They will also identify law graduates with jobs outside of law practice. That’s a much wider scope than the salaries reported to NALP, which disproportionately represent large law firm jobs. Even if some of a school’s graduates leave the state, this in-state salary slice is likely to give prospective students a realistic perspective on the range of salaries earned by a school’s graduates.

2. Rolling Five-Year Averages. The linked-records databases report five-year averages, rather than average salaries for a single graduating class. This feature preserves anonymity in small programs and makes the data less “noisy.” The technique, however, can also mask dramatic market shifts.

This is particularly problematic in law, because average salaries rose dramatically from 2005 through 2009, and then plunged just as precipitously. Most of the states included in the College Measures database report the average salary for students who graduated in 2006 through 2010. For law graduates, those years include at least three high-earning years (2007 through 2009) and just one post-recession year (2010). The outdated averages on the College Measures site almost certainly overstate the amounts earned by more recent law school classes.
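The masking effect is easy to see with a toy calculation, using invented class averages shaped like the boom-and-bust pattern described above:

```python
# Invented average starting salary per graduating class, shaped like the
# boom through 2009 and the post-recession drop in 2010.
class_averages = {2006: 42000, 2007: 48000, 2008: 51000,
                  2009: 52000, 2010: 40000}

rolling = sum(class_averages.values()) / len(class_averages)
latest = class_averages[2010]

print(f"2006-2010 rolling average: ${rolling:,.0f}")
print(f"most recent class (2010):  ${latest:,.0f}")
# The rolling figure overstates the latest class's average by $6,600.
```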

This problem, in my opinion, makes the salaries currently reported for law schools unreliable as predictors of current salaries. On the other hand, the data could be useful for other purposes. It would be instructive, for example, to compare each school’s linked-record average with an average of the salaries that school reported to NALP over the same five years. That comparison might indicate the extent to which NALP-reported salaries skew high. Within a few years, meanwhile, the linked-records databases will offer more useful salary projections for students considering law school. They will also help us see the extent to which salaries for law graduates have shifted over time.

3. Un- and Under-Employed Graduates. The linked-records databases do not reveal how many graduates are unemployed. Graduates who are missing from a state’s records may be unemployed or they may be working in another state. Researchers currently have no way to distinguish those two statuses.

As the research becomes more sophisticated, and especially if researchers are able to link records nationally, this problem will decrease. For now, users of the database have to remember that salaries reflect averages for employed graduates. Users need to search separately for the number of a school’s unemployed graduates.

For law schools, those figures are relatively easy to obtain because they appear on each school’s ABA employment summary. By combining that resource with the College Measures information, prospective students and others can determine the percentage of a law school’s graduates who were employed nine months after graduation, as well as the average salaries earned by graduates who worked in the same state as the school.

Underemployed graduates, those working in part-time or temporary jobs, do appear in most of the linked-record databases. This is a major advantage of the linked-record method: the method calculates each graduate’s annual earnings, even if those wages came from part-time or temporary work. If a graduate worked at more than one job, the linked records will aggregate wages from each of those jobs. The results won’t reveal how hard graduates had to work to generate their income, but database users will be able to tell how much on average they earned.
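The per-graduate aggregation amounts to summing every wage record tied to the same SSN, whatever the job. A sketch with invented records:

```python
from collections import defaultdict

# Invented wage reports: (SSN, employer, wages). A graduate holding two
# part-time jobs appears twice; the linked records sum every report.
reports = [
    ("111-11-1111", "Small Firm LLP", 18000),
    ("111-11-1111", "Retail Job", 12000),
    ("222-22-2222", "County Prosecutor", 46000),
]

annual_earnings = defaultdict(int)
for ssn, _employer, wages in reports:
    annual_earnings[ssn] += wages

# The first graduate shows $30,000 in total earnings: the database
# reveals the amount, but not that it took two jobs to earn it.
print(dict(annual_earnings))
```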

4. Excluded Workers. In addition to the caveats discussed above, the linked-records databases omit two important categories of workers. Most lack information about federal employees, although some states have started adding that information. Within a year or two, federal salaries should be fully integrated with other wages. For law school graduates, meanwhile, salaries for the most common federal jobs are already well known.

More significant, the linked-record databases do not include information about the self-employed. This omission matters more in some fields than others. Utility companies employ the workers who repair high-voltage power lines; you won’t find many free-lancers climbing utility poles. Plumbers, on the other hand, are more likely to set up shop for themselves.

For recent law graduates, the picture is mixed. Relatively few of them open solo practices immediately after graduation, but a growing number may work as independent contractors. The latter group, notably, may include graduates who receive career exploration grants from their schools. Depending on how those grants are structured, the graduates may not count as “employees” of either the school or the organization where they work; instead, they may be independent contractors. If that’s the case, their wages will not appear in the linked-record databases.

As experience grows with linked-record databases, it will be possible to determine how many law graduates fall outside of those records. It should be possible, for example, to compare the number of graduates who report in-state jobs to their schools with the number of in-state salaries recorded in a linked-record database. The difference between the two numbers will represent graduates who work as solos or independent contractors. The researchers creating these databases may also find ways to incorporate earnings data about self-employed graduates.

What About Law Schools?

Tomorrow, I will discuss salary information reported for the fifteen law schools currently included in the College Measures database. If you’re impatient, just follow the links. Those specific results, however, matter less than the overall scope of this salary-tracking method. The linked-record method promises much more sophisticated salary information than educational institutions have ever gathered on their own. The salaries can be tied to specific schools and degree programs. We (as well as prospective students and policymakers) will be able to compare financial outcomes across fields, schools, and states. As the databases grow in size, we will also be able to track salaries five, ten, fifteen, or twenty years after graduation. That amount of information is breathtaking–and a little scary.


Old Tricks

June 23rd, 2013 / By

From time to time, I like to read real books instead of electronic ones. During a recent ramble through my law school’s library, I stumbled across an intriguing set of volumes: NALP employment reports from the late nineteen seventies. These books are so old that they still have those funny cards in the back. It was the content, though, that really took my breath away. During the 1970s, NALP manipulated data about law school career outcomes in a way that makes more contemporary methods look tame. Before I get to that, let me give you the background.

NALP compiled its first employment report for the Class of 1974. The data collection was fairly rudimentary. The association asked all ABA-accredited schools to submit basic data about their graduates, including the total number of class members, the number employed, and the number known to be still seeking work. This generated some pretty patchy statistics. Only 83 schools (out of about 156) participated in the original survey. Those schools graduated 17,188 JDs, but they reported employment data for just 13,250. More than a fifth of the graduates (22.9%) from this self-selected group of schools failed to share their employment status with the schools.

NALP’s early publications made no attempt to analyze this selection bias; the reports I’ve examined (for the Classes of 1977 and 1978) don’t even mention the possibility that graduates who neglect to report their employment status might differ from those who provide that information. The reports address the representativeness of participating schools, but in a comical manner. The reports divide the schools by institutional type (e.g., public or private) and geographic region, then present a cross-tabulation showing the number and percentage of schools participating in each category. For the Class of 1977, participation rates varied from 62.5% to 100%, but the report gleefully declares: “You will note the consistently high percentage of each type of institution, as well as the large number of schools sampled. I believe we can safely say that our study is, in fact, representative!” (p. 7)

Anyone with an elementary grasp of statistics knows that’s nonsense. The question isn’t whether the percentages were “high,” it’s how they varied across categories. Ironically, at the very time that NALP published the quoted language, I was taking a first-year elective on “Law and Social Science” at my law school. It’s galling that law schools weren’t practicing the quantitative basics that they were already teaching.

NALP quickly secured more participating schools, which mooted this particular example of bad statistics. By 1978, NALP was obtaining responses from 150 of the 167 ABA-approved law schools. Higher levels of school participation, however, did not solve the problem of missing graduates. For the Classes of 1974 through 1978, NALP was missing data on 19.4% to 23.7% of the graduates from reporting schools. Blithely ignoring those graduates, NALP calculated the employment rate each year simply by dividing the number of graduates who held any type of job by the number whose employment status was known. This misleading method, which NALP still uses today, yielded an impressive employment rate of 88.1% for the Class of 1974.

But even that wasn’t enough. Starting with the Class of 1975, NALP devised a truly ingenious way to raise employment rates: It excluded from its calculation any graduate who had secured neither a job nor bar admission by the spring following graduation. As NALP explained in the introduction to its 1977 report: “The employment market for new attorneys does not consist of all those that have graduated from ABA-approved law schools. In order for a person to practice law, there is a basic requirement of taking and passing a state bar examination. Those who do not take or do not pass the bar examination should therefore be excluded from the employment market….” (p. 1)

That would make sense if NALP had been measuring the percentage of bar-qualified graduates who obtained jobs. But here’s the kicker: At the same time that NALP excluded unemployed bar no-admits from its calculation, it continued to include employed ones. Many graduates in the latter category held jobs that we call “JD Advantage” ones today. NALP’s 1975 decision gave law schools credit for all graduates who found jobs that didn’t require a law license, while allowing them to disown (for reporting purposes) graduates who didn’t obtain a license and remained jobless.

I can’t think of a justification for that–other than raising the overall employment rate. Measure employment among all graduates, or measure it among all grads who have been admitted to the bar. You can’t use one criterion for employed graduates and a different one for unemployed graduates. Yet the “NALP Research Committee, upon consultation with executive committee members and many placement directors from throughout the country” endorsed this double standard. (id.)

And the trick worked. By counting graduates who didn’t pass the bar but nonetheless secured employment, while excluding those who didn’t take the bar and failed to get jobs, NALP produced a steady rise in JD employment rates: 88.1% in 1974 (under the original method), 91.6% in 1975, 92.5% in 1976, 93.6% in 1977, and a remarkable 94.2% in 1978. That 94.2% statistic ignored 19.5% of graduates who didn’t report any employment status, plus another 3.7% who hadn’t been admitted to the bar and were known to be unemployed but, whatever.
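The arithmetic of the trick is easy to reproduce. With an invented class of 1,000 graduates, all of known employment status:

```python
# Invented class of 1,000 graduates with known employment status.
employed_licensed = 700
employed_unlicensed = 80     # e.g., JD Advantage jobs without bar admission
unemployed_licensed = 120
unemployed_unlicensed = 100  # the group NALP's 1975 method dropped

total = (employed_licensed + employed_unlicensed
         + unemployed_licensed + unemployed_unlicensed)
employed = employed_licensed + employed_unlicensed

honest_rate = employed / total                          # count everyone
nalp_rate = employed / (total - unemployed_unlicensed)  # drop only jobless no-admits

print(f"all graduates counted: {honest_rate:.1%}")  # 78.0%
print(f"1975 NALP method:      {nalp_rate:.1%}")    # 86.7%
```

The numerator stays the same in both calculations; only the denominator shrinks, so the reported rate can only go up.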

NALP was very pleased with its innovation. The report for the Class of 1977 states: “This revised and more realistic picture of the employment market for newly graduated and qualified lawyers reveals that instead of facing unemployment, the prospects for employment within the first year of graduation are in fact better than before. Study of the profile also reveals that there has been an incremental increase in the number of graduates employed and a corresponding drop in unemployment during that same period.” (p. 21) Yup, unemployment rates will fall if you ignore those pesky graduates who neither found jobs nor got admitted to the bar–while continuing to count all of the JD Advantage jobs.

I don’t know when NALP abandoned this piece of data chicanery. My library didn’t order any of the NALP reports between 1979 and 1995, so I can’t trace the evolution of NALP’s reporting method. By 1996, NALP was no longer counting unlicensed grads with jobs while ignoring those without jobs. Someone helped them come to their senses.

Why bring this up now? In part, I’m startled by the sheer audacity of this data manipulation. Equally important, I think it’s essential for law schools to recognize our long history of distorting data about employment outcomes. During the early years of these reports, NALP didn’t even have a technical staff: these reports were written and vetted by placement directors from law schools. It’s a sorry history.


NALP Numbers

June 20th, 2013 / By

NALP, the National Association for Law Placement, has released selected findings about employment for the Class of 2012. The findings and accompanying press release don’t tell us much more than the ABA data published in late March, but there are a few interesting nuggets. Here are my top ten take-aways from the NALP data.

1. Law school leads to unemployment. I’m sorry to put that so bluntly, but it’s true. Even after adopting a very generous definition of employment–one that includes any work for pay, whatever the nature of the work, the number of hours worked per week, and the permanence (or lack thereof) of the position–only 84.7% of graduates from ABA-accredited schools were employed nine months after graduation. Almost one in six graduates had no job at all nine months after graduation. That statistic is beyond embarrassing.

Some of those graduates were enrolled in other degree programs, and some reported that they were not seeking work. Neither of those categories, however, should offer much comfort to law schools or prospective students. It’s true that yet another degree (say an LLM in tax or an MBA) may lead to employment, but those degrees add still more time and money to a student’s JD investment. Graduates who are unemployed and not seeking work, meanwhile, often are studying for the February bar exam–sometimes after failing on their first attempt. Again, this is not a comforting prospect for students considering law school.

Even if we exclude both of those categories, moreover, 10.7% of 2012 graduates–more than one in every ten–were completely unemployed and actively seeking work in February 2013. The national unemployment rate that month was just 7.7%. Even among 25-to-29-year-olds, a group that faces higher than average unemployment, the most recent reported unemployment rate (for 2012) was 8.9%. Recent graduates of ABA-accredited law schools are more likely to be unemployed than other workers their age–most of whom have far less education.

2. Nine months is a long time. When responding to these dismal nine-month statistics, law schools encourage graduates to consider the long term. Humans, however, have this annoying need to eat, stay warm, and obtain health care in the present. Most of us would be pretty unhappy if we were laid off and it took more than nine months to find another job. How would we buy food, pay our rent, and purchase prescriptions during those months? For new graduates it’s even worse. They don’t have the savings that more senior workers may have as a cushion for unemployment; nor can they draw unemployment compensation. On the contrary, they need to start repaying their hefty law school loans six months after graduation.

When we read nine-month statistics, we should bear those facts in mind. Sure, the unemployed graduates may eventually find work. But most of them already withdrew from the workforce for three years of law school; borrowed heavily to fund those years; borrowed still more to support three months of bar study; sustained themselves (somehow) for another six months; and have been hearing from their loan repayment companies for three months. If ten percent are still unemployed and seeking work the February after graduation, what are they living on?

3. If you want to practice law, the outlook is even worse. Buried in the NALP releases, you’ll discover that only 58.3% of graduates secured a full-time job that required bar admission and would last at least a year. Even that estimate is a little high because NALP excludes from its calculation over 1,000 graduates whose employment status was unknown. Three years of law school, three months of bar study, six months of job hunting, and more than two out of every five law graduates still have not found steady, full-time legal work. If you think those two out of five wanted JD Advantage jobs instead, read on.

4. Many of the jobs are stopgap employment. Almost a quarter of 2012 graduates with jobs in February 2013 were actively looking for other work. The percentage of dissatisfied workers was particularly high among those with JD Advantage positions: forty-three percent of them were seeking another job. JD Advantage positions offer attractive career options for some graduates, but for many they are simply a way to pay the bills while continuing the hunt for a legal job.

5. NALP won’t tell you what you want to know. When the ABA reported similar employment statistics in March, it led with the information that most readers want to know: “Law schools reported that 56.2 percent of graduates of the class of 2012 were employed in long-term, full-time positions where bar passage was required.” The ABA’s press release followed up with the percentage of graduates in long-term, full-time JD Advantage positions (9.5%) and offered comparisons to 2011 for both figures. Bottom line: Nine months after graduation, about two-thirds of 2012 graduates had full-time, steady employment related to their JD.

You won’t find that key information in either of the two reports that NALP released today. You can dig out the first of those statistics (the percentage of the class holding full-time, long-term jobs that required bar admission), but it’s buried at the bottom of the second page of the Selected Findings. You won’t find the second statistic (the percentage of full-time, long-term JD Advantage jobs) anywhere; NALP reports only a more general percentage (including temporary and part-time jobs) for that category.

NALP’s Executive Director, James Leipold, laments disclosing even that much. He tells us that the percentage of full-time, long-term jobs requiring bar passage “is certainly not a fair measure of the value of a legal education or the return on investment, or even a fair measure of the success of a particular graduating class in the marketplace.” Apparently forgetting the ABA’s attention to this employment measure, Leipold dismisses it as “the focus of so much of the media scrutiny of legal education.”

What number does NALP feature instead? That overall employment rate of 84.7%, which includes non-professional jobs, part-time jobs, and temporary jobs. Apparently those jobs are a more “fair measure of the value of a legal education.”

6. Law students are subsidizing government and nonprofits. NALP observes that the percentage of government and public interest jobs “has remained relatively stable for more than 30 years, at 26-29%.” At the same time, it reports that most law-school-funded jobs lie in this sector. If the percentage of jobs has remained stable, and law schools are now funding some of those spots, then law schools are subsidizing the government and public interest work. “Law schools,” of course, means students who pay tuition to those schools. Even if schools support post-graduate fellowships with donor money, those contributions could have been used to defray tuition costs.

I’m all in favor of public service, but shouldn’t the taxpayers and charitable donors pay for that work? In the current scheme, law students are borrowing significant sums from the government, at high interest rates, so that they can pay tuition that is used to subsidize government and nonprofit employees. Call me old fashioned, but that seems like a complicated (and regressive) way to pay for needed services. Why not raise taxes on people like me, who actually earn money, rather than issue more loans to people who hope someday to earn money?

7. Don’t pay much attention to NALP’s salary figures. NALP reports some salary information, which the ABA eschews. Those tantalizing figures draw some readers to the NALP report–and hype the full $95 version it will release in August. But the salary numbers are more misleading than useful. NALP reports salary information only for graduates who hold full-time, long-term positions and who report their salaries. That’s a minority of law graduates: Last year NALP reported salaries for just 18,639 graduates, from a total class of 44,495. Reported salaries, therefore, represented just 41.9% of the class. The percentage this year is comparable.

That group, furthermore, disproportionately represents the highest salaries. As NALP itself recognizes, salaries are “disproportionately reported for those graduates working at large firms,” so median salaries are “biased upward.” Swallow any salary reports, in other words, with a tablespoon of salt.

8. After accounting for inflation, today’s reported salaries are lower than ones from the last century. Although NALP’s reported salaries skew high, they offer some guidance to salary trends over time. Unfortunately, those trends are negative. During the early 1990s, the country was in recession and law firms hadn’t yet accelerated pay for new associates. The median reported salary for 1991 graduates was just $40,000. Accounting for inflation, that’s equivalent to a 2012 median salary of $67,428. The actual reported median for the Class of 2012, however, was just $61,245. Even when today’s graduates land a full-time, steady job, they’re earning 9.2% less than graduates from the last century.

9. The lights of BigLaw continue to dim. NALP acknowledges the “‘new normal’ in which large firm hiring has recovered some but remains far below pre-recession highs.” The largest firms, those with more than 500 lawyers, hired more than 3,600 members of the Class of 2012, a total that modestly exceeded the number hired from the Class of 2011. Current employment, however, remains well shy of the 5,100 graduates hired from the Class of 2009. Meanwhile, a growing percentage of those BigLaw hires are staff attorneys rather than associates. These lower-status, lower-paid lawyers currently comprise 4% of new BigLaw hires, and they are “more common than just two years ago.”

Inflation, meanwhile, has eroded salaries for even the best paid associates in BigLaw. In 2000, NALP reported a median salary of $125,000 for graduates joining firms that employed more than 500 lawyers. Adjusting for inflation, that would be $166,662 for the Class of 2012. BigLaw associates won’t starve on the median $160,000 they’re actually earning, but they’re taking home less in real dollars than the associates who started at the turn of the century.

For associates joining the second tier of BigLaw, firms that employ 251-500 lawyers, the salary news is even worse. In 2000, those associates also reported a median salary of $125,000, which would translate to $166,662 today. The actual median, however, appears to be $145,000 (the same figure reported for 2011). That’s a decline of 13% in real dollars.
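The inflation adjustments in items 8 and 9 are straightforward to check. The sketch below uses annual-average CPI-U figures from the Bureau of Labor Statistics (the specific index values are my assumption; the post does not say which index it used, but these values reproduce its numbers):

```python
# Sanity-check the post's inflation-adjusted salary comparisons.
# CPI-U annual averages (BLS) -- assumed index values, not from the post.
CPI = {1991: 136.2, 2000: 172.2, 2012: 229.594}

def in_2012_dollars(amount: float, year: int) -> float:
    """Convert a nominal salary from `year` into 2012 dollars."""
    return amount * CPI[2012] / CPI[year]

# Item 8: 1991 overall median of $40,000 in 2012 dollars.
adj_1991 = in_2012_dollars(40_000, 1991)      # about $67,428
real_decline_1991 = 1 - 61_245 / adj_1991     # about 9.2%

# Item 9: 2000 BigLaw median of $125,000 in 2012 dollars.
adj_2000 = in_2012_dollars(125_000, 2000)     # about $166,662
real_decline_tier2 = 1 - 145_000 / adj_2000   # about 13%
```

The same multiplier shows the top-tier BigLaw figure: $160,000 against an inflation-adjusted $166,662 is roughly a 4% real decline.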

10. It goes almost without saying that these 2012 graduates paid much more for their law school education than students did in 1991, 2000, or almost any other year. Law school tuition has far outpaced inflation over the last three decades. It’s scant comfort to this class–or to the classes of 2010, 2011, or 2013–that heavy discounts are starting to ease tuition. These are classes that bought very high and are selling very low. There’s little that law schools can do to make up the difference to these graduates, but we shouldn’t forget the financial hardship they face. If nothing else, the tuition-jobs gap for these classes should make us commit to the boldest possible reforms of legal education.


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
