Earlier this week, I wrote about the progress that law schools have made in reporting helpful employment statistics. The National Association for Law Placement (NALP), unfortunately, has not made that type of progress. On Wednesday, NALP issued a press release that will confuse most readers; mislead many; and ultimately hurt law schools, prospective students, and the profession. It’s the muddled, the false, and the damaging.
The Muddled
Much of the press release discusses the status of $160,000 salaries for new lawyers. This discussion vacillates between good news (for the minority of graduates who might get these salaries) and bad news. On the one hand, the $160,000 starting salary still exists. On the other hand, the figure hasn’t increased since 2007, producing a decline of 11.7% in real dollars (although NALP doesn’t spell that out).
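For readers who want to check the real-dollar arithmetic, here is a minimal sketch. The CPI-U values below are approximate annual averages that I’ve supplied purely for illustration; the exact percentage depends on which index and endpoints you choose, so my rough numbers land near 12% rather than exactly at the 11.7% cited above.

```python
# A minimal sketch of the real-dollar erosion of a salary frozen at
# $160,000 since 2007.  The CPI-U annual averages are approximate
# values supplied for illustration; swap in official BLS figures
# (and your preferred endpoints) for a precise answer.
CPI_2007 = 207.3   # approximate CPI-U annual average, 2007
CPI_2014 = 236.7   # approximate CPI-U annual average, 2014

nominal = 160_000
real_2007_dollars = nominal * CPI_2007 / CPI_2014
decline = 1 - real_2007_dollars / nominal

print(f"${nominal:,} now buys what ${real_2007_dollars:,.0f} bought in 2007, "
      f"a real decline of about {decline:.1%}")
```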
On the bright side, the percentage of large firm offices paying this salary has increased from 27% in 2014 to 39% this year. On the downside, that percentage still doesn’t approach the two-thirds of large-firm offices that paid $160,000 in 2009. It also looks like the percentage of offices offering $160,000 to this fall’s associates (“just over one-third”) will be slightly lower than the current percentage.
None of this discussion tells us very much. This NALP survey focused on law firms, not individuals, and it tabulated results by office rather than firm. The fact that 39% of offices associated with the largest law firms are paying $160,000 doesn’t tell us how many individuals are earning that salary (let alone what percentage of law school graduates are doing so). And, because NALP has changed its definition of the largest firms since 2009, it’s hard to know what to make of comparisons with previous years.
In the end, all we know is that some new lawyers are earning $160,000–a fact that has been true since 2007. We also know that this salary must be very, very important because NALP repeats the figure (“$160,000”) thirty-two times in a single press release.
The False
In a bolded heading, NALP tells us that its “Data Represent Broad-Based Reporting.” This is so far off the mark that it’s not even “misleading.” It’s downright false. As the press release notes, only 5% of the firms responding to the survey employed 50 lawyers or fewer. (The accompanying table suggests that the true percentage was just 3.5%, but I won’t quibble over that.)
That’s a laughable representation of small law firms, and NALP knows it. Last year, NALP reported that 57.5% of graduates who took jobs with law firms went to firms of 50 lawyers or fewer. Smaller firms tend to hire fewer associates than large ones, and some hire no one at all in a given year. If small firms account for 57.5% of law-firm hires while hiring fewer graduates apiece, they must make up an even larger share of firms: the percentage of “small” firms (those with 50 or fewer lawyers) in the United States undoubtedly is greater than 57.5%–and nowhere near 5%.
NALP’s false statements go beyond a single heading. The press release specifically assures readers that “The report thus sheds light on the breadth of salary differentials among law firms of varying sizes and in a wide range of geographic areas nationwide, from the largest metropolitan areas to much smaller cities.” I don’t know how anyone can make that claim with a straight face, given the lack of response from law firms that make up the majority of firms nationwide.
All of this would be simply absurd, except that NALP also tells readers that “the overall national median first-year salary at firms of all sizes was $135,000,” and that the median for the smallest firms (those with 50 or fewer lawyers) was $121,500. There is some fuzzy language about the median moving up during the last year because of “relatively fewer responses from smaller firms,” but that caveat refers only to the incremental change. Last year’s survey was almost as distorted as this year’s, with just 9.8% of responses coming from firms with 50 or fewer lawyers.
More worrisome, there’s no caveat at all attached to the representation that the median starting salary in the smallest law firms is $121,500. If you think that the 16 responding firms in this category magically represented salaries of all firms with 50 or fewer lawyers, see below. Presentation of the data in this press release as “broad-based” and “shed[ding] light on the breadth of salary differentials” is just breathtakingly false.
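The statistical problem here is easy to demonstrate. Below is a small simulation with entirely invented numbers: a market dominated by small offices paying modest salaries, surveyed in a way that over-samples large offices, just as described above.

```python
import statistics

# Entirely hypothetical numbers, invented for illustration: 1,000 firm
# offices, 80% small (paying ~$60k) and 20% large (paying ~$160k).
small = [60_000] * 800
large = [160_000] * 200
true_median = statistics.median(small + large)

# Now suppose responses come from 5% of small offices but 40% of large
# ones -- roughly the kind of skew described above.
responses = small[:40] + large[:80]
reported_median = statistics.median(responses)

print(f"true median across all offices: ${true_median:,.0f}")     # $60,000
print(f"median among respondents:       ${reported_median:,.0f}") # $160,000
```

With those invented inputs, the “median” reported from survey responses is nearly triple the true market median, even though every individual response is accurate.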
The Damaging
NALP’s false statements damage almost everyone related to the legal profession. The media have reported some of the figures from the press release, and the public response is withering. Clients assume that firms must be bilking them; otherwise, how could so many law firms pay new lawyers so much? Remember that this survey claims a median starting salary of $121,500 even at the smallest firms. Would you approach a law firm to draft your will or handle your divorce if you thought your fees would have to support that type of salary for a brand-new lawyer?
Prospective students will also be hurt if they act on NALP’s misrepresentations. Why shouldn’t they believe an organization called the “National Association for Law Placement,” especially when the organization represents its data as “broad-based”?
Ironically, though, law schools may suffer the most. What happens when prospective students compare NALP’s pumped-up figures with the ones on most of our websites? Nationwide, the median salary for 2013 graduates working in firms of 2-10 lawyers was just $50,000. So far, reports about the Class of 2014 look comparable. (As I’ve explained before, the medians that NALP reports for small firms are probably overstated. But let’s go with the reported median for now.)
When prospective students look at most law school websites, they’re going to see that $50,000 median (or one close to it) for small firms. They’re also going to see that a lot of our graduates work in those small firms of 2-10 lawyers. Nationwide, 8,087 members of the Class of 2013 took a job with one of those firms. That’s twice as many small firm jobs as ones at firms employing 500+ lawyers (which hired 3,980 members of the Class of 2013).
How do we explain the fact that so many of our graduates work at small firms, when NALP claims that these firms represent such a small percentage of practice? And how do we explain that our graduates average only $50,000 in these small-firm jobs, while NALP reports a median of $121,500? And then how do we explain the small number of our graduates who earn this widely discussed salary of $160,000?
With figures like $160,000 and $121,500 dancing in their heads, prospective students will conclude that most law schools are losers. By “most” I mean the 90% of us who fall outside the top twenty schools. Why would a student attend a school that offers outcomes so inferior to ones reported by NALP?
Even if these prospective students have read scholarly analyses showing the historic value of a law degree, they’re going to worry about getting stuck with a lemon school. And compared to the “broad-based” salaries reported by NALP, most of us look pretty sour.
Law schools need to do two things. First, we need to stop NALP from making false statements–or even just badly skewed ones. Each of our institutions pays almost $1,000 per year for this type of reporting. We shouldn’t support an organization that engages in such deception.
Second, we really do need to stop talking about BigLaw and $160,000 salaries. If Michael Simkovic and Frank McIntyre are correct about the lifetime value of a law degree, then we should be able to illustrate that value with real careers and real salaries. What do prosecutors earn compared to other government workers, both entry-level and after 20 years of experience? How much of a premium do businesses pay for a compliance officer with a JD? We should be able to generate answers to those questions. If the answers are positive, and we can place students in the appropriate jobs, we’ll have no trouble recruiting applicants.
If the answers are negative, we need to know that as well. We need to figure out the value of our degree, for our students. Let’s get real. Stop NALP from disseminating falsehoods, stop talking about $16*,*** salaries, and start talking about outcomes we can deliver.
For the last two weeks, Michael Simkovic and I have been discussing the manner in which law schools used to publish employment and salary information. The discussion started here and continued on both that blog and this one. The debate, unfortunately, seems to have confused some readers because of its historical nature. Let’s clear up that confusion: We were discussing practices that, for the most part, ended four or five years ago.
Responding to both external criticism and internal reflection, today’s law schools publish a wealth of data about their employment outcomes; most of that information is both user-friendly and accurate. Here’s a brief tour of what data are available today and what the future might still hold.
ABA Reports
For starters, all schools now post a standard ABA form that tabulates jobs in a variety of categories. The ABA also provides this information on a website that includes a summary sheet for each school and a spreadsheet compiling data from all of the ABA-accredited schools. Data are available for classes going back to 2010; the 2014 data will appear shortly (and are already available on many school sites).
Salary Specifics
The ABA form does not include salary data, and the organization warns schools to “take special care” when reporting salaries because “salary data can so easily be misleading.” Schools seem to take one of two approaches when discussing salary data today.
Some provide almost no information, noting that salaries vary widely. Others post their “NALP Report” or tables drawn directly from that report. What is this report? It’s a collection of data that law schools have been gathering for about forty years, but not disclosing publicly until the last five. The NALP Report for each school summarizes the salary data that the school has gathered from graduates and other sources. You can find examples by googling “NALP Report” along with the name of a law school. NALP reports are available later in the year than ABA ones; you won’t find any 2014 NALP Reports until early summer.
NALP’s data gathering process is far from perfect, as both Professor Simkovic and I have discussed. The report for each school, however, has the virtue of both providing some salary information and displaying the limits of that information. The reports, for example, detail how many salaries were gathered in each employment category. If a law school reports salaries for 19/20 graduates working for large firms, but just 5/30 grads working in very small firms, a reader can make note of that fact. Readers also get a more complete picture of how salaries differ between the public and private sector, as well as within subsets of those groups.
Before 2010, no law school shared its NALP Report publicly. Instead, many schools chose a few summary statistics to disclose. A common approach was to publish the median salary for a particular law school class, without further information about the process of obtaining salary information, the percentage of salaries gathered, or the mix of jobs contributing to the median. If more specific information made salaries look better, schools could (and did) provide that information. A school that placed a lot of graduates in judicial clerkships, government jobs, or public interest positions, for example, often would report separate medians for those categories–along with the higher median for the private sector. Schools had a lot of discretion to choose the most pleasing summary statistic, because no one reported more detailed data.
Given the brevity of reported salary data, together with the potential for these summary figures to mislead, the nonprofit organization Law School Transparency (LST) began urging schools to publish their “full” NALP Reports. “Full” did not mean the entire report, which can be quite lengthy and repetitive. Instead, LST defined the portions of the report that prospective students and others would find helpful. Schools seem to agree with LST’s definition, publishing those portions of the report when they choose to disclose the information.
Today, according to LST’s tracking efforts, at least half of law schools publish their NALP Reports. There may be even more schools that do so; although LST invites ongoing communication with law schools, the schools don’t always choose to update their status for the LST site.
Plus More
The ABA’s standardized employment form, together with greater availability of NALP Reports, has greatly changed the information available to potential law students and other interested parties. But the information doesn’t stop with these somewhat dry forms. Many law schools have built upon these reports to convey other useful information about their graduates’ careers. Although I have not made an exhaustive review, the contemporary information I’ve seen seems to comply with our obligation to provide information that is “complete, accurate and not misleading to a reasonable law school student or applicant.”
In addition to these efforts by individual schools, the ABA has created two websites with consumer information about law schools: the employment site noted above and a second site with other data regularly reported to the ABA. NALP has also increased the amount of data it releases publicly without charge. LST, finally, has become a key source for prospective students who want to sort and compare data drawn from all of these sources. LST has also launched a new series of podcasts that complement the data with a more detailed look at the wide range of lawyers’ work.
Looking Forward
There’s still more, of course, that organizations could do to gather and disseminate data about legal careers. I like Professor Simkovic’s suggestion that the Census Bureau expand the Current Population Survey and American Community Survey to include more detailed information about graduate education. These surveys were developed when graduate education was relatively uncommon; now that post-baccalaureate degrees are more common, it seems critical to have more rigorous data about those degrees.
I also hope that some scholars will want to gather data from bar records and other online sources, as I have done. This method has limits, but so do larger initiatives like After the JD. Because of their scale and expense, those large projects are difficult to maintain–and without regular maintenance, much of their utility fades.
Even with projects like these, however, law schools undoubtedly will continue to collect and publish data about their own employment outcomes. Our institutions compete for students, US News rank, and other types of recognition. Competition begets marketing, and marketing can lead to overstatements. The burden will remain on all of us to maintain professional standards of “complete, accurate and not misleading” information, even as we talk with pride about our schools. Our graduates face similar obligations when they compete for clients. Although all of us chafe occasionally at these duties, they are also the mark of our status as professionals.
Students and practitioners sometimes criticize law professors for knowing too little about the real world. Often, those criticisms are overstated. But then a professor like Michael Simkovic says something so clueless that you start to wonder if the critics are right.
Salaries and Response Rates
In a recent post, Simkovic tries to defend a practice that few other legal educators have defended: reporting entry-level salaries gathered through the annual NALP process without disclosing response rates to the salary question. Echoing a previous post, Simkovic claims that this practice was “an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government.”
Simkovic doesn’t seem to understand how law schools and NALP actually collect salary information; the process is nothing like the government surveys he describes. Because of the idiosyncrasies of the NALP process, the response rate has a particular importance.
Here are the two keys to the NALP process: (1) law schools are allowed–even encouraged–to supplement survey responses with information obtained from third parties; and (2) NALP itself is one of those third parties. Each year NALP publishes an online directory with copious salary information about the largest, best-paying law firms. Smaller firms rarely submit information to NALP, so they are almost entirely absent from the Directory.
As a result, as NALP readily acknowledges, “salaries for most jobs in large firms are reported” by law schools, while “fewer than half the salaries for jobs in small law firms are reported.” That’s “reported” as in “schools have independent information about large-firm salaries.”
For Example
To see an example of how this works in practice, take a look at the most recent (2013) salary report for Seton Hall Law School, where Simkovic teaches. Ten out of the eleven graduates who obtained jobs in firms with 500+ lawyers reported their salaries. But of the 34 graduates who took jobs in the smallest firms (those with 2-10 lawyers), just nine disclosed a salary. In 2010, 2011, and 2012, no graduates in the latter category reported a salary.
If this were a government survey, the results would be puzzling. The graduates working at the large law firms are among those “high-income individuals” that Simkovic tells us “often value privacy and are reluctant to share details about their finances.” Why are they so eager to disclose their salaries, when graduates working at smaller (and lower-paying) firms are not? And why do the graduates at every other law school act the same way? The graduates of Chicago’s Class of 2013 seem to have no sense of privacy: 149 out of 153 graduates working in the private sector happily provided their salaries, most of which were $160,000.
The answer, of course, is the NALP Directory. Law schools don’t need large-firm associates to report their salaries; the schools already know those figures. The current Directory offers salary information for almost 800 offices associated with firms of 200+ lawyers. In contrast, the Directory includes information about just 14 law firms employing 25 or fewer attorneys. That’s 14 nationwide–not 14 in New Jersey.
For the latter salaries, law schools must rely upon graduate reports, which seem difficult to elicit. When grads do report these salaries, they are much lower than the BigLaw ones. At Seton Hall, the nine graduates who reported small-firm salaries yielded a mean of just $51,183.
What Was the Problem?
I’m able to give detailed data in the above example because Seton Hall reports all of that information. It does so, moreover, for years going back to 2010. Other schools have not always been so candid. In the old days, some law schools merged the large-firm salaries provided by NALP with a handful of small-firm salaries collected directly from graduates. The school would then report a median or mean “private practice salary” without further information.
Was this “an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government”? Clearly not–unless the government keeps a list of salaries from high-paying employers that it uses to supplement survey responses. That would be a nifty way to inflate wage reports, but no political party seems to have thought of this just yet.
Law schools, in other words, were not just publishing salary information without disclosing response rates. They were disclosing information that they knew was biased: they had supplemented the survey information with data drawn from the largest firms. The organization supervising the data collection process acknowledged that the salary statistics were badly skewed; so did any dean I talked with during that period.
The criticism of law schools for “failing to report response rates” became a polite shorthand for describing the way in which law schools produced misleading salary averages. Perhaps the critics should have been less polite. We reasoned, however, that if law schools at least reported the “response” rates (which, of course, included “responses” provided by the NALP data), graduates would see that reported salaries clustered in the largest firms. The information would also allow other organizations, like Law School Transparency, to explain the process further to applicants.
This approach gave law schools the greatest leeway to continue reporting salary data and, frankly, to package it in ways that may still overstate outcomes. But let’s not pretend that law schools have been operating social science surveys with an unbiased method of data collection. That wasn’t true in the past, and it’s not true now.
Earlier this week, I noted that even smart academics are misled by the manner in which law schools traditionally reported employment statistics. Steven Solomon, a very smart professor at Berkeley’s law school, was misled by the “nesting” of statistics on NALP’s employment report for another law school.
Now Michael Simkovic, another smart law professor, has proved the point again. Simkovic rather indignantly complains that Kyle McEntee “suggests incorrectly that The New York Times reported Georgetown’s median private sector salary without providing information on what percentage of the class or of those employed were working in the private sector.” But it is Simkovic who is incorrect–and, once again, it seems to be because he was misled by the manner in which law schools report some of their employment and salary data.
Response Rates
What did McEntee say that got Simkovic so upset? McEntee said that a NY Times column (the one authored by Solomon) gave a median salary for Georgetown’s private sector graduates without telling readers “the response rate.” And that’s absolutely right. The contested figures are here on page two. You’ll see that 362 of Georgetown’s 2013 graduates took jobs in the private sector. That constituted 60.3% of the employed graduates. You’ll also see a median salary of $160,000. All of that is what Solomon noted in his Times column (except that he confused the percentage of employed graduates with the percentage of the graduating class).
The fact that Solomon omitted, and that McEntee properly highlighted, is the response rate: the number of graduates who actually reported those salaries. That number appears clearly on the Georgetown report, in the same line as the other information: 362 graduates obtained these private sector jobs, but only 293 of them disclosed salaries for those jobs. Salary information was unavailable for about one-fifth of the graduates holding these positions.
Why does this matter? If you’ve paid any attention to the employment of law school graduates, the answer is obvious. NALP acknowledged years ago that reported salaries suffer from response bias. To see an illustration of this, take a look at the same Georgetown report we’ve been examining. On page 4, you’ll see that salaries were known for 207 of the 211 graduates (98.1%) working in the largest law firms. For graduates working in the smallest category of firms, just 7 out of 27 salaries (25.9%) were available. For public interest jobs that required bar admission, just 15 out of 88 salaries (17.0%) were known.
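A quick sketch makes the disparity vivid, using only the figures just quoted from the Georgetown report (salaries known, out of graduates in each category):

```python
# Salary response rates by category, from the Georgetown figures cited
# above: (salaries known, graduates in category).
categories = {
    "largest law firms":              (207, 211),
    "smallest firm category":         (7, 27),
    "public interest (bar required)": (15, 88),
    "all private sector":             (293, 362),
}

for name, (known, total) in categories.items():
    print(f"{name:32s} {known:3d}/{total:3d} = {known/total:6.1%}")
```

The overall private sector “response rate” of roughly 81% masks category rates ranging from 98% down to 17%, which is exactly the bias NALP itself acknowledges.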
Simkovic may think it’s ok for Solomon to discuss medians in his Times column without disclosing the response rate. I disagree–and I think a Times reporter would as well. Respected newspapers are more careful about things like response rates. But whether or not you agree with Solomon’s writing style, McEntee is clearly right that he omitted the response rate on the data he discussed.
So Simkovic, like Solomon, seems to be confused by the manner in which law schools report information on NALP forms. 60% of the employed graduates held private sector jobs, but that’s not the response rate for salaries. And there’s a pretty strong consensus that the salary responses on the NALP questionnaire are biased–even NALP thinks so.
Misleading By Omission
The ABA’s standard employment report has brought more clarity to reporting entry-level employment outcomes. Solomon and Simkovic were not confused by data appearing on that form, but by statistics contained in NALP’s outmoded form. Once again, their errors confirm the problems in old reporting practices.
More worrisome than this confusion, Solomon and Simkovic both adopt a strategy that many law schools followed before the ABA intervened: they omit information that a reader (or potential student) would find important. The most mind-boggling fact about Georgetown’s 2013 employment statistics is that the school itself hired 83 of its graduates–12.9% of the class. For 80 of those graduates, Georgetown provided a full year of full-time employment.
Isn’t that something you would want to know in evaluating whether “[a]t the top law schools, things are returning to the years before the financial crisis”? That’s the lead-in to Solomon’s upbeat description of Georgetown’s employment statistics–the description that then neglects to mention how many of the graduates’ jobs were funded by their own law school.
I’m showing my age here, but back in the twentieth century, T14 schools didn’t fund jobs for one out of every eight graduates. Nor was that type of funding common in those hallowed years more immediately preceding the financial crisis.
I’ll readily acknowledge that Georgetown funds more graduate jobs than most other law schools, but the practice exists at many top schools. It’s Solomon who chose Georgetown as his example. Why, then, are he and Simkovic so silent about these school-funded jobs?
Final Thoughts
I ordinarily wouldn’t devote an entire post to a law professor’s errors in reading an employment table. We all make too many errors for that to be newsworthy. But Simkovic insists that law schools have never misled anyone with their employment statistics–and here we have two examples of smart, knowledgeable people misled by those very statistics.
Speaking of which, Simkovic defends Solomon’s error by suggesting that he “simply rounded up” from 56% to 60% because four percent is a “small enough difference.” Rounded up? Ask any law school dean whether a four-point difference in an employment rate matters. Or check back in some recent NALP reports. The percentage of law school graduates obtaining nine-month jobs in law firms fell from 50.9% in 2010 to 45.9% in 2011. Maybe we could have avoided this whole law school crisis thing if we’d just “rounded up” the 2011 number to 50%.
Some legal educators have a New Yorker’s view of the world. Like the parochial Manhattanite in Saul Steinberg’s famous illustration, these educators don’t see much beyond their own fiefdom. They see law graduates out there in the world, practicing their profession or working in related fields. And there are doctors, who (regrettably) make more money than lawyers do. But really, what else is there? What do people do if they don’t go to law school?
Michael Simkovic takes this position in a recent post, declaring (in bold) that: “The question everyone who decides not to go to law school . . . must answer is–what else out there is better?” In a footnote, Simkovic concedes that “[a]nother graduate degree might be better than law school for a particular individual,” but he clearly doesn’t think much of the idea.
People, of course, work in hundreds of occupations other than law. Some of them even enjoy their work. Simkovic’s concern lies primarily with the financial return on college and graduate degrees. Even here, though, the contemporary options are much broader than many legal educators realize.
Time Was: The 1990s
Financially, the late twentieth century was a good time to be a lawyer. When the Bureau of Labor Statistics (BLS) published its first Occupational Employment Statistics (OES) in 1997, the four occupations with the highest salaries were medicine, dentistry, podiatry, and law. Those four occupations topped the salary list (in that order) whether sorted by mean or median salary. [Note that OES collects data only on salaries; it does not include self-employed individuals like solo practitioners or partners–whether in law or medicine. For more on that point, see the end of this post.]
Law was a pretty good deal in those days. The graduate program was just three years, rather than four. There were no college prerequisites and no post-graduate internships. Knowledge of math was optional, and exposure to bodily fluids minimal. Imagine earning a median salary of $109,987 (in 2014 dollars) without having to examine feet! Then again, a willingness to spend four years of graduate school studying feet, along with a lifetime of treating them, would have netted you a 28% increase in median salary.
But let’s not dally any longer in the twentieth century.
Time Is: 2014
BLS just released its latest survey of occupational wages, and the results show how much the economy has changed. Law practice has slipped to twenty-second place in a listing of occupations by mean salary, and twenty-sixth place when ranked by median. One subset of lawyers, judges and magistrates, holds twenty-fifth place on the list of median salaries, but practicing lawyers have slipped a notch lower.
About half the slippage in law’s salary prominence stems from the splintering of medical occupations, both in the real world and as measured by BLS. We no longer visit “doctors”; we see pediatricians, general practitioners, internists, obstetricians, anesthesiologists, surgeons, and psychiatrists–often in that order. These medical specialists, along with the dentists and podiatrists, all enjoy a higher median salary than lawyers.
There are two other health-related professions, meanwhile, that have moved ahead of lawyers in wages: nurse anesthetists and pharmacists. Both of these fields require substantial graduate education: at least two years for nurse anesthetists and two to four years for pharmacists. But the training pays off with a median salary of $153,780 for nurse anesthetists and $120,950 for pharmacists.
Today’s college graduates, furthermore, don’t have to deal with teeth, airways, or medications to earn more than lawyers do. The latest BLS survey includes nine other occupations that top lawyers’ median salary: financial managers, airline pilots, natural sciences managers, air traffic controllers, marketing managers, computer and information systems managers, petroleum engineers, architectural and engineering managers, and chief executives.
How much do salaried lawyers earn in their more humble berth on the OES list? They collected a median salary of $114,970 in 2014. That’s good, but it’s only 4.5% higher (in inflation-adjusted dollars) than the median salary in 1997. Pharmacists enjoyed a whopping 28% increase in median real wages to reach $120,950 in 2014. And the median nurse anesthetist earned a full third more than the median lawyer that year.
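Readers can verify these growth comparisons directly from the medians quoted in this post; note that the 1997 pharmacist median below is back-derived from the 28% figure rather than taken from BLS, so treat it as illustrative.

```python
# Growth checks using the medians quoted above (1997 figures already
# expressed in 2014 dollars, per the OES notes at the end of this post).
lawyer_1997, lawyer_2014 = 109_987, 114_970
print(f"lawyers' real growth: {lawyer_2014/lawyer_1997 - 1:.1%}")  # ~4.5%

# A 28% real increase for pharmacists implies a 1997 median near:
pharmacist_2014 = 120_950
print(f"implied 1997 pharmacist median: ${pharmacist_2014/1.28:,.0f}")

# Nurse anesthetists' premium over lawyers in 2014:
print(f"nurse anesthetist premium: {153_780/lawyer_2014 - 1:.1%}")  # ~34%
```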
If you’re a college student willing to set your financial sights just a bit lower than the median salary in law practice, there are lots of other options. Here are some of the occupations with a 2014 median salary falling between $100,000 and $114,970: sales manager, physicist, computer hardware engineer, computer and information research scientist, compensation and benefits manager, purchasing manager, astronomer, aerospace engineer, political scientist, mathematician, software developer for systems software, human resources manager, training and development manager, public relations and fundraising manager, optometrist, nuclear engineer, and prosthodontist (those are the folks who will soon be fitting baby boomers for their false teeth).
Law graduates could apply their education to some of these jobs; with a few more years of graduate education, a savvy lawyer could offer the aging boomers a package deal on a will and a new pair of choppers. But the most common themes in these salary-leading occupations do not revolve around law. Instead, the themes are math, science, and management–none of which we teach very well in law school.
Twenty-first Century Humility
Lawyers will not disappear. Even Richard Susskind, who asked about “The End of Lawyers?” in a provocative book title, doesn’t think lawyers are done for. We still need lawyers to fill both traditional roles and new ones. Lawyers, however, will not have the same economic and social dominance that they enjoyed in the late twentieth century.
Some lawyers will still make a lot of money. As the American Lawyer proclaimed last year, the “super rich” are getting richer. But the prospects for other lawyers are less certain, and the appeal of competing fields has increased.
If law schools want to understand the decline in talented applicants, they need to look more closely at the competition. What do today’s high school students and middle schoolers think about law? Those students will choose their majors soon after arriving at college. Once they choose engineering, computer science, business, or health-related courses, a legal career will seem even less appealing. If we want potential students to find law attractive, we need to know more about their alternatives and preferences.
We also need to be realistic about how many students ultimately will–or should–pursue a law degree. As citizens of a healthy economy, we need doctors, nurse anesthetists, pharmacists, managers, and software developers. We even need the odd astronomer or two. Law is just one of the many occupations that make a society thrive. The twenty-first century is a time of interdependence that should bring a sense of humility.
Notes
Here are some key points about the method behind the OES survey. For more information, see this FAQ page, which includes the information I summarize here:
1. OES obtains wage data directly from establishments. This method eliminates bias that may occur when individuals report their own wages. The survey, however, includes only wage data for salaried employees. Solo practitioners (in any field) are excluded, as are individuals who draw their income entirely from partnerships or other forms of profit sharing.
2. “Wages” include production bonuses and tips, but not end-of-year bonuses, profit-sharing, or benefits.
3. Although BLS publishes OES data every year, the data are gathered on a rolling basis. Income for “1997” or “2014” reflects data gathered over three years, including the reference year. BLS adjusts wage figures for the two older years, using the Employment Cost Index, so the reported wages appear in then-current dollars. The three-year collection period, however, can mask sudden shifts in employment trends.
4. BLS cautions against using OES data to compare changes in employment data over time, unless the user offers necessary context. In particular, it is important for readers to understand that short-term comparisons are difficult (because of the point in the previous paragraph) and that occupational categories change frequently. For those reasons, I have limited my cross-time comparisons and have noted the splintering of occupational categories. The limited comparison offered here, however, seems helpful in understanding the relationship of law practice to other high-paying occupations.
5. For the data used in this post, follow this link and download the spreadsheets. The HTML versions are prettier, but they do not include all of the data.
What obligations, if any, do academic institutions owe potential students? When soliciting these “customers,” how candid should schools be in discussing graduation rates, scholarship conditions, or the employment outcomes of recent graduates? Do the obligations differ for a professional school that will teach students about the ethics of communicating with their own future customers?
New Marketing/New Concerns
Once upon a time, we marketed law schools with a printed brochure or two. That changed with the advent of the new century and the internet. Now marketing is pervasive: web pages, emails, blog posts, and forums.
With increased marketing, some educators began to worry about how we presented ourselves to students. As a sometime social scientist, I was particularly concerned about the way in which some law schools reported median salaries without disclosing the number of graduates supplying that information. A school could report that it had employment information from 99% of its graduates, that 60% were in private practice, and that the median salary for those private practitioners was $120,000. Nowhere did the reader learn that only 45% of the graduates reported salary information. [This is a hypothetical example; it does not represent any particular law school.]
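Here is the arithmetic behind that hypothetical, spelled out. The class size is invented, and I’ve assumed (as the layout of such reports invites readers to assume) that each percentage refers to the whole class:

```python
# The hypothetical school above, with an invented class of 1,000.
class_size = 1_000
employment_known = 0.99 * class_size   # employment status known
private_practice = 0.60 * class_size   # graduates in private practice
salary_reports = 0.45 * class_size     # graduates reporting a salary

# The "$120,000 median private-practice salary" rests on at most the
# reporting graduates who were in private practice -- far fewer than
# the 600 private practitioners the headline number implies:
print(f"salaries behind the median: at most {salary_reports:.0f} "
      f"of {private_practice:.0f} private practitioners")
```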
I also noticed that, although law schools know only the average “amount borrowed” by their students, schools and the media began to represent that figure as the average “debt owed.” Interest, unfortunately, accumulates while a student is in law school, so the “amount borrowed” significantly understates the “debt owed” when loans fall due.
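The gap between “amount borrowed” and “debt owed” is easy to illustrate. Here is a minimal sketch, assuming $40,000 borrowed at the start of each of three years at a 6.8% unsubsidized rate with simple interest; all of those parameters are invented for illustration, and actual federal rates, disbursement timing, and capitalization rules vary:

```python
# Interest accrues on unsubsidized loans while the student is enrolled,
# so debt at graduation exceeds the amount borrowed.  Loan amount, rate,
# and simple-interest treatment are all illustrative assumptions.
annual_loan = 40_000
rate = 0.068          # hypothetical unsubsidized rate
years = 3

borrowed = annual_loan * years
# Year-1 money accrues ~3 years of interest by graduation, year-2 ~2, year-3 ~1:
accrued = sum(annual_loan * rate * (years - y) for y in range(years))

owed = borrowed + accrued
print(f"amount borrowed: ${borrowed:,}; debt owed at graduation: "
      f"${owed:,.0f} ({accrued/borrowed:.1%} more)")
```

Even under these modest assumptions, the “debt owed” at graduation runs more than 13% above the “amount borrowed”–precisely the understatement described above.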
Other educators worried about a lack of candor when schools offered scholarships to students. A school might offer an attractive three-year scholarship to an applicant, with the seemingly easy condition that the student maintain a B average. The school knew that it tightly controlled curves in first-year courses, so that a predictable number of awardees would fail that condition, but the applicants didn’t understand that. This isn’t just a matter of optimism bias; undergraduates literally do not understand law school curves. A few years ago, one law school hopeful said to me: “What’s the big deal about grade competition in law school? It’s not like there’s a limit on the number of A’s or anything.” When I explained the facts of law school life, she went off to pursue a Ph.D. in botany.
And then there was the matter of nested statistics. Schools would report the number of employed graduates, then identify percentages of those graduates working in particular job categories. Categories spawned sub-categories, and readers began to lose sight of the denominator. Even respected scholars like Steven Solomon get befuddled by these statistics. Yesterday, Solomon misinterpreted Georgetown’s 2013 employment statistics due to this type of nesting: he mistook 60% of employed graduates for 60% of the graduating class. (Georgetown, to its credit, provides clearer statistics on a different page than the one Solomon used.)
Educators, of course, weren’t the only ones who noticed these problems. We were slow–much too slow–to address our lapses, and we suffered legitimate criticism from the media and organizations like Law School Transparency. Indeed, the criticisms continue, as professors persist in making misleading statements.
For me, these are ethical issues. I believe that educators do have a special obligation to prospective students; they are not just “customers,” they are people who depend upon us for instruction and wise counsel. At law schools, prospective students are also future colleagues in the legal profession; even while we teach, we are an integral part of the profession.
With that in mind, I communicate with prospective students as I would talk to a colleague asking about an entry-level teaching position or a potential move to another school. I tell students what I would want to know if I were in their position. And, consistent with my role as a teacher and scholar, I try to present the information in a manner that is straightforward and easy to understand. For the last few years, most law schools have followed the same golden rules–albeit with considerable prodding from Law School Transparency, the ABA, and the media.
Revisionist History
Now that law schools have become more careful in their communications with potential students, revisionist history has appeared. Ignoring all of the concerns discussed above (although they appear in sources he cites), Michael Simkovic concludes that “The moral critique against law schools comes down to this: The law schools used the same standard method of reporting data as the U.S. Government.”
Huh? When the government publishes salaries in SIPP, a primary source for Simkovic’s scholarship, I’m pretty sure they disclose how many respondents refused to provide that information. Reports on the national debt, likewise, include interest accrued rather than just the original amounts borrowed–although I will concede that there’s plenty of monkey business in that reporting. I’ll also concede that welfare recipients probably don’t fully understand the conditions in the contracts they sign.
Simkovic, of course, doesn’t mean to set the government up as a model on these latter points. Instead, he ignores those issues and pretends that the ethical critique of law schools focused on just one point: calculation of the overall employment rate. On this, Simkovic has good news for law schools: they can ethically count a graduate as employed as long as the graduate was paid for a single hour of work during the reporting week–because that’s the way the government does it.
I don’t think any law school has ever been quite that audacious, and the ABA certainly would not approve. The implications of Simkovic’s argument, however, illuminate a key point: law schools communicate for a different purpose, and to a different audience, than the Bureau of Labor Statistics. The primary consumers of our employment statistics are current and potential students. We draft our employment statistics for that audience, and the information should be tailored to them.
As for scholarship, I will acknowledge that the U.S. government owns the word “unemployment.” I used a non-standard definition of that concept in a recent paper, and clearly designated it as such. But this seems to distract some readers, so I’ll refer to those graduates as “not working.” I suspect it’s all the same to them.