For the last two weeks, Michael Simkovic and I have been discussing the manner in which law schools used to publish employment and salary information. The discussion started here and continued on both that blog and this one. The debate, unfortunately, seems to have confused some readers because of its historical nature. Let’s clear up that confusion: We were discussing practices that, for the most part, ended four or five years ago.
Responding to both external criticism and internal reflection, today’s law schools publish a wealth of data about their employment outcomes; most of that information is both user-friendly and accurate. Here’s a brief tour of what data are available today and what the future might still hold.
ABA Reports
For starters, all schools now post a standard ABA form that tabulates jobs in a variety of categories. The ABA also provides this information on a website that includes a summary sheet for each school and a spreadsheet compiling data from all of the ABA-accredited schools. Data are available for classes going back to 2010; the 2014 data will appear shortly (and are already available on many school sites).
Salary Specifics
The ABA form does not include salary data, and the organization warns schools to “take special care” when reporting salaries because “salary data can so easily be misleading.” Schools seem to take one of two approaches when discussing salary data today.
Some provide almost no information, noting that salaries vary widely. Others post their “NALP Report” or tables drawn directly from that report. What is this report? It’s a collection of data that law schools have been gathering for about forty years, but not disclosing publicly until the last five. The NALP Report for each school summarizes the salary data that the school has gathered from graduates and other sources. You can find examples by googling “NALP Report” along with the name of a law school. NALP reports are available later in the year than ABA ones; you won’t find any 2014 NALP Reports until early summer.
NALP’s data gathering process is far from perfect, as both Professor Simkovic and I have discussed. The report for each school, however, has the virtue of both providing some salary information and displaying the limits of that information. The reports, for example, detail how many salaries were gathered in each employment category. If a law school reports salaries for 19/20 graduates working for large firms, but just 5/30 grads working in very small firms, a reader can make note of that fact. Readers also get a more complete picture of how salaries differ between the public and private sector, as well as within subsets of those groups.
Before 2010, no law school shared its NALP Report publicly. Instead, many schools chose a few summary statistics to disclose. A common approach was to publish the median salary for a particular law school class, without further information about the process of obtaining salary information, the percentage of salaries gathered, or the mix of jobs contributing to the median. If more specific information made salaries look better, schools could (and did) provide that information. A school that placed a lot of graduates in judicial clerkships, government jobs, or public interest positions, for example, often would report separate medians for those categories–along with the higher median for the private sector. Schools had a lot of discretion to choose the most pleasing summary statistic, because no one reported more detailed data.
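To see why those response rates matter, consider a toy calculation. The salary figures below are hypothetical (they do not represent any school's actual data), but the response pattern mirrors the one described above: nearly complete reporting at large firms, sparse reporting at small ones.

```python
import statistics

# Hypothetical class of 50 private-sector graduates (invented numbers):
# 20 at large firms earning $160,000, 30 at small firms earning $55,000.
large_firm = [160_000] * 20
small_firm = [55_000] * 30
true_median = statistics.median(large_firm + small_firm)  # all 50 salaries

# Now suppose salaries are known for 19/20 large-firm grads but only
# 5/30 small-firm grads -- the response pattern described above.
reported = large_firm[:19] + small_firm[:5]
reported_median = statistics.median(reported)

print(true_median)      # -> 55000.0 (most grads work at small firms)
print(reported_median)  # -> 160000.0 (responses cluster at large firms)
```

A school publishing only the second number, without the response rate, would leave readers with a wildly inflated picture of typical pay.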
Given the brevity of reported salary data, together with the potential for these summary figures to mislead, the nonprofit organization Law School Transparency (LST) began urging schools to publish their “full” NALP Reports. “Full” did not mean the entire report, which can be quite lengthy and repetitive. Instead, LST defined the portions of the report that prospective students and others would find helpful. Schools seem to agree with LST’s definition, publishing those portions of the report when they choose to disclose the information.
Today, according to LST’s tracking efforts, at least half of law schools publish their NALP Reports. There may be even more schools that do so; although LST invites ongoing communication with law schools, the schools don’t always choose to update their status for the LST site.
Plus More
The ABA’s standardized employment form, together with greater availability of NALP Reports, has greatly changed the information available to potential law students and other interested parties. But the information doesn’t stop with these somewhat dry forms. Many law schools have built upon these reports to convey other useful information about their graduates’ careers. Although I have not made an exhaustive review, the contemporary information I’ve seen seems to comply with our obligation to provide information that is “complete, accurate and not misleading to a reasonable law school student or applicant.”
In addition to these efforts by individual schools, the ABA has created two websites with consumer information about law schools: the employment site noted above and a second site with other data regularly reported to the ABA. NALP has also increased the amount of data it releases publicly without charge. LST, finally, has become a key source for prospective students who want to sort and compare data drawn from all of these sources. LST has also launched a new series of podcasts that complement the data with a more detailed look at the wide range of lawyers’ work.
Looking Forward
There’s still more, of course, that organizations could do to gather and disseminate data about legal careers. I like Professor Simkovic’s suggestion that the Census Bureau expand the Current Population Survey and American Community Survey to include more detailed information about graduate education. These surveys were developed when graduate education was relatively uncommon; now that post-baccalaureate degrees are more common, it seems critical to have more rigorous data about those degrees.
I also hope that some scholars will want to gather data from bar records and other online sources, as I have done. This method has limits, but so do larger initiatives like After the JD. Because of their scale and expense, those large projects are difficult to maintain–and without regular maintenance, much of their utility falls.
Even with projects like these, however, law schools undoubtedly will continue to collect and publish data about their own employment outcomes. Our institutions compete for students, US News rank, and other types of recognition. Competition begets marketing, and marketing can lead to overstatements. The burden will remain on all of us to maintain professional standards of “complete, accurate and not misleading” information, even as we talk with pride about our schools. Our graduates face similar obligations when they compete for clients. Although all of us chafe occasionally at duties, they are also the mark of our status as professionals.
Students and practitioners sometimes criticize law professors for knowing too little about the real world. Often, those criticisms are overstated. But then a professor like Michael Simkovic says something so clueless that you start to wonder if the critics are right.
Salaries and Response Rates
In a recent post, Simkovic tries to defend a practice that few other legal educators have defended: reporting entry-level salaries gathered through the annual NALP process without disclosing response rates to the salary question. Echoing a previous post, Simkovic claims that this practice was “an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government.”
Simkovic doesn’t seem to understand how law schools and NALP actually collect salary information; the process is nothing like the government surveys he describes. Because of the idiosyncrasies of the NALP process, the response rate has a particular importance.
Here are the two keys to the NALP process: (1) law schools are allowed–even encouraged–to supplement survey responses with information obtained from third parties; and (2) NALP itself is one of those third parties. Each year NALP publishes an online directory with copious salary information about the largest, best-paying law firms. Smaller firms rarely submit information to NALP, so they are almost entirely absent from the Directory.
As a result, as NALP readily acknowledges, “salaries for most jobs in large firms are reported” by law schools, while “fewer than half the salaries for jobs in small law firms are reported.” That’s “reported” as in “schools have independent information about large-firm salaries.”
For Example
To see an example of how this works in practice, take a look at the most recent (2013) salary report for Seton Hall Law School, where Simkovic teaches. Ten out of the eleven graduates who obtained jobs in firms with 500+ lawyers reported their salaries. But of the 34 graduates who took jobs in the smallest firms (those with 2-10 lawyers), just nine disclosed a salary. In 2010, 2011, and 2012, no graduates in the latter category reported a salary.
If this were a government survey, the results would be puzzling. The graduates working at the large law firms are among those “high-income individuals” that Simkovic tells us “often value privacy and are reluctant to share details about their finances.” Why are they so eager to disclose their salaries, when graduates working at smaller (and lower-paying) firms are not? And why do the graduates at every other law school act the same way? The graduates of Chicago’s Class of 2013 seem to have no sense of privacy: 149 out of 153 graduates working in the private sector happily provided their salaries, most of which were $160,000.
The answer, of course, is the NALP Directory. Law schools don’t need large-firm associates to report their salaries; the schools already know those figures. The current Directory offers salary information for almost 800 offices associated with firms of 200+ lawyers. In contrast, the Directory includes information about just 14 law firms employing 25 or fewer attorneys. That’s 14 nationwide–not 14 in New Jersey.
For the latter salaries, law schools must rely upon graduate reports, which seem difficult to elicit. When grads do report these salaries, they are much lower than the BigLaw ones. At Seton Hall, the nine graduates who reported small-firm salaries yielded a mean of just $51,183.
What Was the Problem?
I’m able to give detailed data in the above example because Seton Hall reports all of that information. It does so, moreover, for years going back to 2010. Other schools have not always been so candid. In the old days, some law schools merged the large-firm salaries provided by NALP with a handful of small-firm salaries collected directly from graduates. The school would then report a median or mean “private practice salary” without further information.
Was this “an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government”? Clearly not–unless the government keeps a list of salaries from high-paying employers that it uses to supplement survey responses. That would be a nifty way to inflate wage reports, but no political party seems to have thought of this just yet.
Law schools, in other words, were not just publishing salary information without disclosing response rates. They were disclosing information that they knew was biased: they had supplemented the survey information with data drawn from the largest firms. The organization supervising the data collection process acknowledged that the salary statistics were badly skewed; so did any dean I talked with during that period.
The criticism of law schools for “failing to report response rates” became a polite shorthand for describing the way in which law schools produced misleading salary averages. Perhaps the critics should have been less polite. We reasoned, however, that if law schools at least reported the “response” rates (which, of course, included “responses” provided by the NALP data), graduates would see that reported salaries clustered in the largest firms. The information would also allow other organizations, like Law School Transparency, to explain the process further to applicants.
This approach gave law schools the greatest leeway to continue reporting salary data and, frankly, to package it in ways that may still overstate outcomes. But let’s not pretend that law schools have been operating social science surveys with an unbiased method of data collection. That wasn’t true in the past, and it’s not true now.
Earlier this week, I noted that even smart academics are misled by the manner in which law schools traditionally reported employment statistics. Steven Solomon, a very smart professor at Berkeley’s law school, was misled by the “nesting” of statistics on NALP’s employment report for another law school.
Now Michael Simkovic, another smart law professor, has proved the point again. Simkovic rather indignantly complains that Kyle McEntee “suggests incorrectly that The New York Times reported Georgetown’s median private sector salary without providing information on what percentage of the class or of those employed were working in the private sector.” But it is Simkovic who is incorrect–and, once again, it seems to be because he was misled by the manner in which law schools report some of their employment and salary data.
Response Rates
What did McEntee say that got Simkovic so upset? McEntee said that a NY Times column (the one authored by Solomon) gave a median salary for Georgetown’s private sector graduates without telling readers “the response rate.” And that’s absolutely right. The contested figures are here on page two. You’ll see that 362 of Georgetown’s 2013 graduates took jobs in the private sector. That constituted 60.3% of the employed graduates. You’ll also see a median salary of $160,000. All of that is what Solomon noted in his Times column (except that he confused the percentage of employed graduates with the percentage of the graduating class).
The fact that Solomon omitted, and that McEntee properly highlighted, is the response rate for the number of graduates who reported those salaries. That number appears clearly on the Georgetown report, in the same line as the other information: 362 graduates obtained these private sector jobs, but only 293 of them disclosed salaries for those jobs. Salary information was unavailable for about one-fifth of the graduates holding these positions.
Why does this matter? If you’ve paid any attention to the employment of law school graduates, the answer is obvious. NALP acknowledged years ago that reported salaries suffer from response bias. To see an illustration of this, take a look at the same Georgetown report we’ve been examining. On page 4, you’ll see that salaries were known for 207 of the 211 graduates (98.1%) working in the largest law firms. For graduates working in the smallest category of firms, just 7 out of 27 salaries (25.9%) were available. For public interest jobs that required bar admission, just 15 out of 88 salaries (17.0%) were known.
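Spelled out, those response rates come straight from dividing the two numbers on each line of the Georgetown report. A quick sketch (using the figures quoted above):

```python
# Salary response rates from Georgetown's 2013 NALP report (page 4),
# as quoted above: (salaries known, graduates in category).
categories = {
    "largest firms": (207, 211),
    "smallest firms": (7, 27),
    "public interest (bar required)": (15, 88),
}

for name, (known, total) in categories.items():
    rate = 100 * known / total
    print(f"{name}: {rate:.1f}% of salaries known")
# -> largest firms: 98.1% of salaries known
# -> smallest firms: 25.9% of salaries known
# -> public interest (bar required): 17.0% of salaries known
```

The spread between 98.1% and 17.0% is the response bias in a nutshell.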
Simkovic may think it’s ok for Solomon to discuss medians in his Times column without disclosing the response rate. I disagree–and I think a Times reporter would as well. Respected newspapers are more careful about things like response rates. But whether or not you agree with Solomon’s approach, McEntee is clearly right that Solomon omitted the response rate for the data he discussed.
So Simkovic, like Solomon, seems to be confused by the manner in which law schools report information on NALP forms. 60% of the employed graduates held private sector jobs, but that’s not the response rate for salaries. And there’s a pretty strong consensus that the salary responses on the NALP questionnaire are biased–even NALP thinks so.
Misleading By Omission
The ABA’s standard employment report has brought more clarity to reporting entry-level employment outcomes. Solomon and Simkovic were not confused by data appearing on that form, but by statistics contained in NALP’s more outmoded form. Once again, their errors confirm the problems in old reporting practices.
More worrisome than this confusion is that Solomon and Simkovic both adopt a strategy that many law schools followed before the ABA intervened: they omit information that a reader (or potential student) would find important. The most mind-boggling fact about Georgetown’s 2013 employment statistics is that the school itself hired 83 of its graduates–12.9% of the class. For 80 of those graduates, Georgetown provided a full year of full-time employment.
Isn’t that something you would want to know in evaluating whether “[a]t the top law schools, things are returning to the years before the financial crisis”? That’s the lead-in to Solomon’s upbeat description of Georgetown’s employment statistics–the description that then neglects to mention how many of the graduates’ jobs were funded by their own law school.
I’m showing my age here, but back in the twentieth century, T14 schools didn’t fund jobs for one out of every eight graduates. Nor was that type of funding common in those hallowed years more immediately preceding the financial crisis.
I’ll readily acknowledge that Georgetown funds more graduate jobs than most other law schools, but the practice exists at many top schools. It’s Solomon who chose Georgetown as his example. Why are he and Simkovic then so silent about these school-funded jobs?
Final Thoughts
I ordinarily wouldn’t devote an entire post to a law professor’s errors in reading an employment table. We all make too many errors for that to be newsworthy. But Simkovic insists that law schools have never misled anyone with their employment statistics–and here we have two examples of smart, knowledgeable people misled by those very statistics.
Speaking of which, Simkovic defends Solomon’s error by suggesting that he “simply rounded up” from 56% to 60% because four percent is a “small enough difference.” Rounded up? Ask any law school dean whether a four-point difference in an employment rate matters. Or check back in some recent NALP reports. The percentage of law school graduates obtaining nine-month jobs in law firms fell from 50.9% in 2010 to 45.9% in 2011. Maybe we could have avoided this whole law school crisis thing if we’d just “rounded up” the 2011 number to 50%.
Some legal educators have a New Yorker’s view of the world. Like the parochial Manhattanite in Saul Steinberg’s famous illustration, these educators don’t see much beyond their own fiefdom. They see law graduates out there in the world, practicing their profession or working in related fields. And there are doctors, who (regrettably) make more money than lawyers do. But really, what else is there? What do people do if they don’t go to law school?
Michael Simkovic takes this position in a recent post, declaring (in bold) that: “The question everyone who decides not to go to law school . . . must answer is–what else out there is better?” In a footnote, Simkovic concedes that “[a]nother graduate degree might be better than law school for a particular individual,” but he clearly doesn’t think much of the idea.
People, of course, work in hundreds of occupations other than law. Some of them even enjoy their work. Simkovic’s concern lies primarily with the financial return on college and graduate degrees. Even here, though, the contemporary options are much broader than many legal educators realize.
Time Was: The 1990s
Financially, the late twentieth century was a good time to be a lawyer. When the Bureau of Labor Statistics (BLS) published its first Occupational Employment Statistics (OES) in 1997, the four occupations with the highest salaries were medicine, dentistry, podiatry, and law. Those four occupations topped the salary list (in that order) whether sorted by mean or median salary. [Note that OES collects data only on salaries; it does not include self-employed individuals like solo practitioners or partners–whether in law or medicine. For more on that point, see the end of this post.]
Law was a pretty good deal in those days. The graduate program was just three years, rather than four. There were no college prerequisites and no post-graduate internships. Knowledge of math was optional, and exposure to bodily fluids minimal. Imagine earning a median salary of $109,987 (in 2014 dollars) without having to examine feet! Of course, a willingness to spend four years of graduate school studying feet, along with a lifetime of treating them, would have netted you a 28% increase in median salary.
But let’s not dally any longer in the twentieth century.
Time Is: 2014
BLS just released its latest survey of occupational wages, and the results show how much the economy has changed. Law practice has slipped to twenty-second place in a listing of occupations by mean salary, and twenty-sixth place when ranked by median. One subset of lawyers, judges and magistrates, holds twenty-fifth place on the list of median salaries, but practicing lawyers have slipped a notch lower.
About half the slippage in law’s salary prominence stems from the splintering of medical occupations, both in the real world and as measured by BLS. We no longer visit “doctors,” we see pediatricians, general practitioners, internists, obstetricians, anesthesiologists, surgeons, and psychiatrists–often in that order. These medical specialists, along with the dentists and podiatrists, all enjoy a higher median salary than lawyers.
There are two other health-related professions, meanwhile, that have moved ahead of lawyers in wages: nurse anesthetists and pharmacists. Both of these fields require substantial graduate education: at least two years for nurse anesthetists and two to four years for pharmacists. But the training pays off with a median salary of $153,780 for nurse anesthetists and $120,950 for pharmacists.
Today’s college graduates, furthermore, don’t have to deal with teeth, airways, or medications to earn more than lawyers do. The latest BLS survey includes nine other occupations that top lawyers’ median salary: financial managers, airline pilots, natural sciences managers, air traffic controllers, marketing managers, computer and information systems managers, petroleum engineers, architectural and engineering managers, and chief executives.
How much do salaried lawyers earn in their more humble berth on the OES list? They collected a median salary of $114,970 in 2014. That’s good, but it’s only 4.5% higher (in inflation-adjusted dollars) than the median salary in 1997. Pharmacists, by contrast, enjoyed a whopping 28% increase in median real wages to reach $120,950 in 2014. And the median nurse anesthetist earned a full third more than the median lawyer that year.
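Those comparisons are easy to verify from the BLS medians quoted in this post (a quick sketch; all dollar figures are the cited OES medians, expressed in 2014 dollars):

```python
# Median salaries cited above, all in 2014 dollars.
lawyer_1997 = 109_987       # 1997 OES median for lawyers
lawyer_2014 = 114_970       # 2014 OES median for lawyers
anesthetist_2014 = 153_780  # 2014 OES median for nurse anesthetists

growth = lawyer_2014 / lawyer_1997 - 1
print(f"Lawyers' real median-salary growth, 1997-2014: {growth:.1%}")
# -> 4.5%

premium = anesthetist_2014 / lawyer_2014 - 1
print(f"Nurse anesthetist premium over lawyers, 2014: {premium:.0%}")
# -> 34%, i.e., a full third more
```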
If you’re a college student willing to set your financial sights just a bit lower than the median salary in law practice, there are lots of other options. Here are some of the occupations with a 2014 median salary falling between $100,000 and $114,970: sales manager, physicist, computer hardware engineer, computer and information research scientist, compensation and benefits manager, purchasing manager, astronomer, aerospace engineer, political scientist, mathematician, software developer for systems software, human resources manager, training and development manager, public relations and fundraising manager, optometrist, nuclear engineer, and prosthodontist (those are the folks who will soon be fitting baby boomers for their false teeth).
Law graduates could apply their education to some of these jobs; with a few more years of graduate education, a savvy lawyer could offer the aging boomers a package deal on a will and a new pair of choppers. But the most common themes in these salary-leading occupations do not revolve around law. Instead, the themes are math, science, and management–none of which we teach very well in law school.
Twenty-first Century Humility
Lawyers will not disappear. Even Richard Susskind, who asked about “The End of Lawyers?” in a provocative book title, doesn’t think lawyers are done for. We still need lawyers to fill both traditional roles and new ones. Lawyers, however, will not have the same economic and social dominance that they enjoyed in the late twentieth century.
Some lawyers will still make a lot of money. As the American Lawyer proclaimed last year, the “super rich” are getting richer. But the prospects for other lawyers are less certain, and the appeal of competing fields has increased.
If law schools want to understand their decline in talented applicants, they need to look more closely at the competition. What do today’s high school students and middle schoolers think about law? Those students will choose their majors soon after arriving at college. Once they choose engineering, computer science, business, or health-related courses, a legal career will seem even less appealing. If we want potential students to find law attractive, we need to know more about their alternatives and preferences.
We also need to be realistic about how many students ultimately will–or should–pursue a law degree. As citizens of a healthy economy, we need doctors, nurse anesthetists, pharmacists, managers, and software developers. We even need the odd astronomer or two. Law is just one of the many occupations that make a society thrive. The twenty-first century is a time of interdependence that should bring a sense of humility.
Notes
Here are some key points about the method behind the OES survey. For more information, see this FAQ page, which includes the information I summarize here:
1. OES obtains wage data directly from establishments. This method eliminates bias that may occur when individuals report their own wages. The survey, however, includes only wage data for salaried employees. Solo practitioners (in any field) are excluded, as are individuals who draw their income entirely from partnerships or other forms of profit sharing.
2. “Wages” include production bonuses and tips, but not end-of-year bonuses, profit-sharing, or benefits.
3. Although BLS publishes OES data every year, the data are gathered on a rolling basis. Income for “1997” or “2014” reflects data gathered over three years, including the reference year. BLS adjusts wage figures for the two older years, using the Employment Cost Index, so the reported wages appear in then-current dollars. The three-year collection period, however, can mask sudden shifts in employment trends.
4. BLS cautions against using OES data to compare changes in employment data over time, unless the user offers necessary context. In particular, it is important for readers to understand that short-term comparisons are difficult (because of the point in the previous paragraph) and that occupational categories change frequently. For those reasons, I have limited my cross-time comparisons and have noted the splintering of occupational categories. The limited comparison offered here, however, seems helpful in understanding the relationship of law practice to other high-paying occupations.
5. For the data used in this post, follow this link and download the spreadsheets. The HTML versions are prettier, but they do not include all of the data.
What obligations, if any, do academic institutions owe potential students? When soliciting these “customers,” how candid should schools be in discussing graduation rates, scholarship conditions, or the employment outcomes of recent graduates? Do the obligations differ for a professional school that will teach students about the ethics of communicating with their own future customers?
New Marketing/New Concerns
Once upon a time, we marketed law schools with a printed brochure or two. That changed with the advent of the new century and the internet. Now marketing is pervasive: web pages, emails, blog posts, and forums.
With increased marketing, some educators began to worry about how we presented ourselves to students. As a sometime social scientist, I was particularly concerned about the way in which some law schools reported median salaries without disclosing the number of graduates supplying that information. A school could report that it had employment information from 99% of its graduates, that 60% were in private practice, and that the median salary for those private practitioners was $120,000. Nowhere did the reader learn that only 45% of the graduates reported salary information. [This is a hypothetical example; it does not represent any particular law school.]
I also noticed that, although law schools know only the average “amount borrowed” by their students, schools and the media began to represent that figure as the average “debt owed.” Interest, unfortunately, accumulates while a student is in law school, so the “amount borrowed” significantly understates the “debt owed” when loans fall due.
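The gap between “amount borrowed” and “debt owed” is easy to illustrate. The figures below are hypothetical–a $40,000 annual loan and a 7% rate are assumptions for the sketch, not any school’s numbers–but the mechanism is real: unsubsidized federal loan interest accrues (as simple interest) from disbursement until repayment begins.

```python
# Hypothetical student: borrows $40,000 at the start of each of three
# years of law school, at an assumed 7% annual rate. Interest accrues
# from disbursement until repayment begins at graduation.
rate = 0.07
amount_borrowed = 0.0
debt_owed = 0.0
for years_outstanding in (3, 2, 1):  # loans from the 1L, 2L, and 3L years
    principal = 40_000
    amount_borrowed += principal
    debt_owed += principal * (1 + rate * years_outstanding)

print(f"Amount borrowed: ${amount_borrowed:,.0f}")        # -> $120,000
print(f"Debt owed at graduation: ${debt_owed:,.0f}")      # -> $136,800
print(f"Understatement: {debt_owed / amount_borrowed - 1:.0%}")  # -> 14%
```

Even before a graduate makes a single payment, reporting the “amount borrowed” understates the actual debt by a double-digit percentage.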
Other educators worried about a lack of candor when schools offered scholarships to students. A school might offer an attractive three-year scholarship to an applicant, with the seemingly easy condition that the student maintain a B average. The school knew that it tightly controlled curves in first-year courses, so that a predictable number of awardees would fail that condition, but the applicants didn’t understand that. This isn’t just a matter of optimism bias; undergraduates literally do not understand law school curves. A few years ago, one law school hopeful said to me: “What’s the big deal about grade competition in law school? It’s not like there’s a limit on the number of A’s or anything.” When I explained the facts of law school life, she went off to pursue a Ph.D. in botany.
And then there was the matter of nested statistics. Schools would report the number of employed graduates, then identify percentages of those graduates working in particular job categories. Categories spawned sub-categories, and readers began to lose sight of the denominator. Even respected scholars like Steven Solomon get befuddled by these statistics. Yesterday, Solomon misinterpreted Georgetown’s 2013 employment statistics due to this type of nesting: he mistook 60% of employed graduates for 60% of the graduating class. (Georgetown, to its credit, provides clearer statistics on a different page than the one Solomon used.)
Educators, of course, weren’t the only ones who noticed these problems. We were slow–much too slow–to address our lapses, and we suffered legitimate criticism from the media and organizations like Law School Transparency. Indeed, the criticisms continue, as professors persist in making misleading statements.
For me, these are ethical issues. I believe that educators do have a special obligation to prospective students; they are not just “customers,” they are people who depend upon us for instruction and wise counsel. At law schools, prospective students are also future colleagues in the legal profession; even while we teach, we are an integral part of the profession.
With that in mind, I communicate with prospective students as I would talk to a colleague asking about an entry-level teaching position or a potential move to another school. I tell students what I would want to know if I were in their position. And, consistent with my role as a teacher and scholar, I try to present the information in a manner that is straightforward and easy to understand. For the last few years, most law schools have followed the same golden rules–albeit with considerable prodding from Law School Transparency, the ABA, and the media.
Revisionist History
Now that law schools have become more careful in their communications with potential students, revisionist history has appeared. Ignoring all of the concerns discussed above (although they appear in sources he cites), Michael Simkovic concludes that “The moral critique against law schools comes down to this: The law schools used the same standard method of reporting data as the U.S. Government.”
Huh? When the government publishes salaries in SIPP, a primary source for Simkovic’s scholarship, I’m pretty sure they disclose how many respondents refused to provide that information. Reports on the national debt, likewise, include interest accrued rather than just the original amounts borrowed–although I will concede that there’s plenty of monkey business in that reporting. I’ll also concede that welfare recipients probably don’t fully understand the conditions in the contracts they sign.
Simkovic, of course, doesn’t mean to set the government up as a model on these latter points. Instead, he ignores those issues and pretends that the ethical critique of law schools focused on just one point: calculation of the overall employment rate. On this, Simkovic has good news for law schools: they can ethically count a graduate as employed as long as the graduate was paid for a single hour of work during the reporting week–because that’s the way the government does it.
I don’t think any law school has ever been quite that audacious, and the ABA certainly would not approve. The implications of Simkovic’s argument, however, illuminate a key point: law schools communicate for a different purpose, and to a different audience, than the Bureau of Labor Statistics. The primary consumers of our employment statistics are current and potential students. We draft our employment statistics for that audience, and the information should be tailored to them.
As for scholarship, I will acknowledge that the U.S. government owns the word “unemployment.” I used a non-standard definition of that concept in a recent paper, and clearly designated it as such. But this seems to distract some readers, so I’ll refer to those graduates as “not working.” I suspect it’s all the same to them.
What is the Bureau of Labor Statistics (BLS), and what can it do for you? The BLS is an independent statistical agency that measures “labor market activity, working conditions, and price changes in the economy.” You’ve sampled BLS wares if you’ve relied upon the Consumer Price Index, unemployment rates, or average wages.
One program within BLS tries to project employment growth for hundreds of different occupations. The Bureau issues these forecasts every two years, with each projection spanning a decade. The most recent projections, released in December 2013, attempt to forecast occupational growth between 2012 and 2022.
Why does BLS spend your tax dollars trying to do this? Most parents can’t predict what their teenagers will do next week. How does the BLS think it can predict the behavior of an entire economy, including growth rates in so many different occupations?
The truth is that it can’t, at least not with the level of accuracy that some users would like. There are just too many variables, not to mention acts of God and war. The latest evaluation of BLS’s occupational projections found that, when BLS projected occupational growth between 1996 and 2006, it failed to foresee the following:
* Immigration would be higher than the Census Bureau predicted
* Women’s labor force participation would decline
* Terrorists would hijack 4 jets, level the WTC, and damage the Pentagon
* The United States would go to war in both Afghanistan and Iraq
* A housing bubble would double home prices over the decade
* Internet-based services would cut the number of travel agents by a third
It was a tumultuous decade, but so are most decades. Given the twists and turns of human history, which affect the type of work that humans do, why does BLS even bother with occupational projections?
Better Than the Alternatives
Like democracy, BLS’s projections seem to be better than the alternatives. In particular, these forecasts are better than ones that rely solely on historical trends. In 2010, the Bureau tested its model against four different “naive models” that drew solely on historical data. A common naive model (and one that the Bureau tested) predicts each occupation’s growth rate based on that occupation’s rate of growth during the previous 10 years. Another variation, also tested by the Bureau, uses the most recent five years to project future growth.
On three out of four measures, the Bureau’s predictions outperformed all of the naive models. Predicting the future is difficult, especially when that future includes human actions. The Bureau’s experience, however, suggests that past performance is not the best guide to occupational growth; adding other ingredients to the forecast improves its accuracy.
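For readers curious about the mechanics, the simplest naive model the Bureau tested can be sketched in a few lines: extrapolate each occupation’s next decade from its previous decade. The occupation counts below are hypothetical, for illustration only, not BLS data.

```python
# Naive projection model: assume each occupation grows over the next decade
# at the same rate it grew over the previous decade.
# (Hypothetical worker counts for illustration; not BLS data.)

def naive_projection(count_10yr_ago: float, count_now: float) -> float:
    """Project an occupation's size a decade out from its past growth rate."""
    growth = count_now / count_10yr_ago  # growth factor over the last decade
    return count_now * growth            # apply the same factor going forward

# An occupation that grew from 500,000 to 600,000 workers (20% growth)
# would be projected to keep growing at 20% per decade.
projected = naive_projection(500_000, 600_000)
print(f"{projected:,.0f}")  # 720,000
```

The Bureau’s own model, by contrast, folds in macroeconomic assumptions and industry staffing patterns rather than relying on trend extrapolation alone, which is why it beat models like this one on most measures.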
Who Needs It?
Even if BLS predictions are better than naive models, who needs these predictions? Why engage in such an imprecise exercise? BLS began projecting occupational growth after World War II in order to help returning veterans identify promising career paths. The program persisted as a way to serve “individuals seeking career guidance,” as well as “policymakers, community planners, and educational authorities who need information for long-term policy planning purposes.”
If BLS wants students to use its occupational projections for “career guidance,” then why does it warn against using the projections to predict labor shortages or surpluses? Don’t students examine these projections precisely to determine which occupations are growing and which ones are declining? How is occupational “growth” different from a labor “shortage” in that occupation?
The two concepts are related, yet different. Remember that BLS projects (however imperfectly) the number of people who will actually fill an occupation a decade later. The Bureau doesn’t estimate how many people will want to work in that field or how many will prepare to do so; that’s not its task. The Bureau also assumes that the labor market will “clear.” In other words, if demand falls for workers in a particular field, those workers will go elsewhere. They won’t simply hang around the edges of the occupation, constituting a surplus labor supply.
This doesn’t mean, however, that the number of workers preparing to enter an occupation is irrelevant to predicting job and salary prospects for that occupation. If the pipeline of aspiring workers is easy to quantify, and if the occupation itself is tightly defined, then comparing the worker supply to job projections can yield useful information. If labor supply greatly exceeds likely job openings, then one of three things is likely to happen: (1) some of the workers will take other jobs; (2) wages in the occupation will decline; or (3) both.
What About Law?
The worker pipeline is relatively easy to specify in law. Almost no one becomes a lawyer without obtaining a JD, and there is evidence (p. 72) that most law graduates want to practice law at least for a while. The occupation itself is also well defined. Law graduates can apply their education to a range of law-related jobs, but there is widespread consensus on which jobs are “lawyering” jobs that require bar admission. These are the same jobs that graduates, on the whole, prefer.
Under those conditions, it is useful to compare the number of law school graduates to projected job openings for lawyers. That is what I did several years ago. At that time, the number of students progressing through the law school pipeline greatly exceeded the number of lawyering positions that BLS projected. A substantial number of those graduates, I predicted, would have to find work outside of law practice. Wages for entry-level lawyers might also fall.
That is, in fact, what happened. My recent study of new lawyers admitted to the Ohio bar confirms that, four and a half years after graduation, one quarter of licensed lawyers were working in jobs that did not require a law degree. After accounting for graduates who didn’t take or pass the bar exam, it appears that a full third of recent law school graduates are not practicing as lawyers.
The good news is that my study suggests there may be more job openings for lawyers than BLS projected. Not enough to satisfy all of the graduates who want those jobs, but more than BLS estimated.
Meanwhile, there is also evidence that wages have declined for entry-level lawyers. The median starting salary reported to NALP for the Class of 2008 was $72,000; five years later, the median reported salary for the Class of 2013 was $62,467. The comparison looks even worse after adjusting for inflation: If the median wage had kept pace with inflation from its 2008 level, it would have been almost $78,000 by 2013. The real median wage for new lawyers fell by 19.8% over those five years.
Will law graduates who were unable to find a lawyering job find satisfaction in other jobs? They might; probably some will and some won’t. Will they prosper financially from their law degree, regardless of occupation? They might, if historical patterns hold. To the extent their wage losses represent effects of the recession, will they make up those differences later in their careers? Again, they might if historical patterns hold. But for students investing more than $100,000 in a legal education, it’s worth considering as much information as possible. That includes BLS projections for their desired occupation.
These projections are also useful–when combined with other available information–for legal educators to consider. The career prospects of our graduates should inform the educational programs we design, as well as the information we offer potential applicants. BLS projections represent only a small piece of this puzzle, but they offer one perspective on how the labor market for lawyers is performing.
What About Those New Projections?
The BLS recently changed the way in which it measures occupational “separations.” That’s an estimate of the number of people who will leave a particular occupation. This measure, in turn, affects the projection of job openings; when a worker leaves an occupation, that departure often creates a job opening. Under this new method, BLS will project more lawyering jobs than it did in the past. That sounds like good news for aspiring lawyers, and it is–in part. The change also reveals some unsettling trends in our profession, which I’ll explore in a future post.
Cafe Manager & Co-Moderator
Deborah J. Merritt
Cafe Designer & Co-Moderator
Kyle McEntee