George Henry Thomas went to work as a law clerk in nineteenth-century Virginia. Fortunately for the United States, he found that the work “lacked excitement,” and he enrolled in the United States Military Academy at West Point. After Thomas gained field experience, he was invited back to West Point as an instructor. There, Thomas earned both the respect and friendship of the Academy’s commandant, Robert E. Lee.
Thomas and Lee later traveled to the southwest, serving on military missions and deepening their friendship. The two particularly shared a love of their home state, Virginia.
And then Virginia seceded from the Union.
We all know what happened to Lee. He declined a top post in the Union command and renounced his oath to the United States. He led the confederate army for much of the Civil War, defending an economy and lifestyle based on white ownership of black slaves. He invaded the nation he had sworn to protect, killing more than 5,200 Union soldiers at Gettysburg and Antietam alone: more deaths on American soil than occurred in the homeland attacks on Pearl Harbor or the World Trade Center.
Overall, Americans suffered more casualties in the Civil War than in all other wars combined.
But what happened to Thomas? Despite his love of Virginia and family ties to that state, he refused to break his oath to the United States. The erstwhile law clerk commanded Union troops throughout the Civil War, from Mill Springs (where he gave the Union its first serious battlefield victory) to the March on Atlanta. Thomas’s family renounced him for remaining loyal to the United States; his confederate friends called for him to be hanged as a traitor.
When the war ended, Thomas led troops overseeing Reconstruction. He helped defend freed slaves from local governments and the newborn Ku Klux Klan. In 1868, he warned about attempts to lionize the confederacy:
The greatest efforts made by the defeated insurgents since the close of the war have been to promulgate the idea that the cause of liberty, justice, humanity, equality, and all the calendar of the virtues of freedom, suffered violence and wrong when the effort for southern independence failed. This is, of course, intended as a species of political cant, whereby the crime of treason might be covered with a counterfeit varnish of patriotism.
How many statues have Americans erected to honor the man who kept his oath to his country, fought against slavery, and recognized the evils of romanticizing the confederacy? Just one (in Washington, D.C.).
How many statues have we erected to Lee, the man who broke his oath, defended slavery, invaded his former country, and led a war that killed more than half a million Americans? Too many.
We need not excoriate Lee today: reconciliation is part of ending conflict. But it’s long past time to take down all the statues, and we are sadly mistaken to honor him as a leader. We need to come to terms with the way in which Americans have romanticized the confederacy and its culture.
The Council of the ABA Section of Legal Education and Admissions to the Bar has weathered significant criticism over the last few years. Some of that criticism has been well founded; other attacks have been unfair. But now the Council is acting as its own worst enemy–pursuing a course that has already provoked significant criticism in the legal academy and probably will attract negative attention in the press.
As Jerry Organ explains in a detailed column, the Council voted in June to make several changes in the form used to report law school employment outcomes. The Council acted without any public notice, without following its usual processes, and without gathering input from anyone outside the Council. The lack of process is especially disturbing given: (a) some of the changes had previously provoked vigorous debate; (b) the Council had previously rejected some of the proposals in light of that debate; and (c) the Council–along with legal education more generally–has been accused of lacking transparency.
I am sure, as Council Chair Gregory Murphy has written, that the Council acted in good faith–believing that the changes would receive “universal, or near universal, acclamation.” But that’s the problem with disregarding process and input: a small group of decision makers can persuade themselves that they know best. This case is a good illustration of how even highly educated, well-intentioned groups can fall prey to that fallacy.
The following was part of a series published by the National Law Journal called “Law Schools Are Losing Smart Applicants. How Do They Lure Them Back?” The NLJ asked 11 people from inside and outside the legal academy for responses, including me and Kyle McEntee. His response has been republished here.
Law schools should set reasonable list prices that reflect the earnings available to their graduates. Our high sticker/discount system requires applicants to commit to legal education, invest significant time and money studying for the LSAT, and risk rejection from multiple schools—all before they learn the true cost of their legal education. That system discourages the type of careful thinkers and planners who once found law school attractive.
On campus, we should integrate much more hands-on work throughout the curriculum. Millennials like to do things, not just read about them. Employers, clients and cognitive scientists agree that “doing” is essential to develop professional expertise. Until we embrace that wisdom, we won’t attract talented students back to law school—or prepare them to serve their clients effectively.
Finally, we should replace mandatory grading curves with more nuanced assessments of student learning. Outcome-based assessment helps students focus on the specific knowledge and skills they need to master. Students learn more and employers receive more helpful information about a graduate’s abilities. An educational program that promises to foster expertise, rather than ranking students on a fixed curve, will draw more talented applicants.
The Council of the ABA Section of Legal Education and Admissions to the Bar has granted provisional accreditation to the University of North Texas at Dallas College of Law. As I wrote last fall, this innovative law school well deserved a chance to try its wings.
Here are some distinctive features of the school:
The first group of 74 graduates will receive their degrees this month–and those degrees are now from an ABA-accredited law school. Godspeed UNT-Dallas and grads!
Altman Weil has released its annual report on “Law Firms in Transition.” The report, based on a survey of managing partners of law firms with at least 50 lawyers, documents continued change in the way law firms staff their matters.
More than half of these law firms now use contract lawyers (57.1%) or part-time lawyers (52.7%). More than two-fifths (41.8%) employ staff attorneys. Larger firms (those with at least 250 lawyers) are more likely than smaller firms to use these staffing strategies. Indeed, about three quarters of those larger firms use contract lawyers (77.0%), part-time lawyers (71.3%), or staff attorneys (78.2%). The numbers, however, are still significant at firms with 50-249 lawyers–especially for contract lawyers. More than half (50.4%) of the mid-sized firms use those lawyers.
Law firms have adopted these strategies because they increase profitability. Sixty-nine percent of the surveyed leaders indicated that “shifting work to contract/temporary lawyers” has resulted in a “significant improvement” in that metric. Almost half (49.5%) thought that “shifting work from lawyers to paraprofessionals” had the same salutary effect.
Law firms have also started to push the next frontier in staffing client matters: using artificial intelligence (like IBM’s Watson) for some analyses. More than a third of law firms (36.3%) have started using those tools or “have begun to explore” those opportunities.
These results are not surprising to anyone who has followed law firm trends since the Great Recession. They underline, however, firms’ enthusiasm for these new staffing models.
H/t to TaxProf for noting the availability of Altman Weil’s report.
Douglas Kahn has posted an article criticizing the “proliferation of clinical and other experiential courses” in legal education. These courses, he argues, reduce the number of “doctrinal” courses that students take, leaving them “ill-prepared to practice law as soon after graduation as law firms would like.” The TaxProf Blog posted a summary of the article, and a baker’s dozen of readers have offered pro and con comments.
It’s an old debate, one that has simmered for more than 50 years. The discussion doesn’t surprise me, but Professor Kahn’s ignorance of clinical education does. His bold assertions about clinics reveal little familiarity with the actual operation of those courses. Let’s examine some of Kahn’s claims.
The ABA has posted its report on employment outcomes for the Class of 2016, along with two school-by-school spreadsheets. One of the spreadsheets tracks law school funded jobs that require bar passage; the other details other employment outcomes. My initial takeaways are:
Overall, the report suggests continued weakness in the entry-level job market for law graduates. The decline in the absolute number of graduates holding full-time, long-term jobs requiring bar admission is worrisome–especially since that measure is taken a full ten months after graduation. Even more troubling, 10% of the nation’s law graduates remained unemployed and seeking work at that ten-month mark.
Cafe Manager & Co-Moderator: Deborah J. Merritt
Cafe Designer & Co-Moderator: Kyle McEntee
Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at merritt52@gmail.com. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.