Helping international JD students improve their background knowledge of US history, legal system, etc.

Post by Stephen Horowitz, Professor of Legal English

On the Academic Support Professionals listserv the other day, a great question popped up:

“Can anyone recommend resources for international JD students looking to improve their baseline knowledge of the US legal/political systems and/or US history? A faculty member teaching [course name] is looking for recommendations for a student who attended high school and college outside of the US. The faculty member believes this student would benefit from resources that explain the US government at a more basic level than what is covered in the course. Thanks in advance for any resources or leads!”

I really appreciated this question for a few reasons:

  • (1) International JD students (i.e., students who didn’t grow up in the US education system) are a growing segment of the law school community, yet they generally don’t get the same level of legal English support that international LLM students may receive. Plus their needs are often different from those of both regular JD students and international LLM students.
  • (2) Background and cultural knowledge is such a significant component of comprehension in US law school, yet it’s difficult to acquire if you didn’t grow up with it. And if you did grow up in the US, it’s hard to notice or be aware of the challenges of functioning effectively in US law school without it (or with less of it).

I’ve been keeping my eyes open for years for resources that can help international LLM students with this, and that’s part of the reason I created the Legal English Resources page on this blog.

But until I saw the question above, I’d never organized my thoughts specifically with international JD students in mind. Yet the answers poured forth quickly and enthusiastically in my email response to the listserv. And so I figured this information might be helpful for others as well.

One of the key qualities of these resources, by the way, is that they generally don’t require much extra work on the part of the professor or student advisor. You can pretty much hand any of these off to students and let them run with it. Or, if a resource requires a little preparation, once you’ve done it, you don’t have to think about it again.

Resources to help international JD students learn important background information about US history, the US political system, and the US legal system

1. Civics101 Podcast (produced by New Hampshire public radio) – lots of short episodes on a wide range of topics. In their own words, “What’s the difference between the House and the Senate? How do landmark Supreme Court decisions affect our lives? What does the 2nd Amendment really say? Civics 101 is the podcast about how our democracy works…or is supposed to work, anyway.”

2. Street Law: A Course in Practical Law textbook – used primarily for high school students, but great for international students too. Plus a glossary in the back! I’ve used parts of the book with LLM students in the past and also pointed a colleague to it who used several chapters to develop an entire legal English criminal law course for international LLM students.

3. iCivics – an online ed company that creates materials to teach civics to US students. I haven’t had occasion to use any of their materials yet, but it looks like an intriguing option worth checking out. Here’s a description of who they are in their own words: “iCivics champions equitable, non-partisan civic education so that the practice of democracy is learned by each new generation. We work to inspire life-long civic engagement by providing high quality and engaging civics resources to teachers and students across our nation.”

4. Newsela.com – It’s a huge extensive reading library of real news and other articles written at 5 different levels of difficulty (or ease). And it’s accessible for free with registration. While most of it is news articles, there’s also a whole section on civics/US history and a number of articles that might be helpful. For example, I remember they have the Constitution, the Declaration of Independence, the Federalist Papers, and Brown v. Board and Plessy v. Ferguson, all rewritten at 5 different levels of difficulty. There are also profiles of famous Americans, including some presidents, Supreme Court justices, civil rights leaders, etc. But you have to sift through to find some of this stuff. Also, they may have put a paywall up on some of the materials other than the news articles since I last used it.

5. Khan Academy has a slew of video lessons on history and civics. The key is narrowing it down. During the pandemic, I created a Khan Academy “course” for Georgetown LLM students by just adding the units and lessons that seemed relevant, and I told students to register and use it if they wanted to learn more beyond my actual class with them. In total, I found about 35 different lessons/items that felt relevant and appropriate to include in my “class.” There’s a screenshot below to give you a sense of some of the topics. But feel free to contact me directly if you want to know which ones they are so you can create your own class. Happy to share.

6. The Scrambled States of America (the game)

This game is based on a clever children’s book of the same name. My kids (5, 7 and 11 at the time) got into it during the pandemic, and in addition to being super fun and super easy, within a few weeks they had all absorbed every state, state capital, and state nickname, in addition to having a sense of where the states are located. I’ve learned over the years that my international students often have little sense of US geography outside of New York and Los Angeles. US geography is important background knowledge to have in US law school, as it often provides vital context. Yet US geography is rarely taught to international law students. And when it is, it’s hard to do as effectively as this game does. Let international JD students spend a couple hours playing this and they’ll be all set with their geography. And you’ll have a great time if you play with them!

7. Legal English Resources page on the Georgetown Legal English Blog: In addition to all the items listed above, there are many more on the Legal English Resources page. So I encourage you to take a look. Maybe you’ll find something else there that fits the needs of your students. (Or maybe you’ll have a suggestion for a helpful resource that I didn’t know about!)

New Legal English Book: “Practical English Language Skills for Lawyers”

Post by Stephen Horowitz, Professor of Legal English

I had a great time yesterday moderating the well-attended pre-book launch webinar for a new legal English book published by Routledge titled Practical English Language Skills for Lawyers by co-authors Natasha Costello and Louise Kulbicki, both UK-trained, Europe-based legal English professionals who are active members of EULETA and widely respected in the field.

I was also fortunate to be one of the reviewers for the book and had an opportunity to see how well the authors incorporate authentic materials and tasks, as well as highlight differences between British and American legal English throughout.

Listen to the recording of the webinar below to learn more about the book, get a sneak peek, and hear the questions from attendees.

Here’s a brief description of the book:

“This book shows non-native, English speaking lawyers how to apply their English language skills to everyday legal situations and contexts, providing essential guidance to ensure they can work confidently in different settings and mediums.

Including activities based on real-life scenarios, the book will allow lawyers and law students to practise their English in key areas of working life, from networking and client meetings, to telephone and conference calls, contract drafting and contract negotiations, presentations and using social media. Written by two highly experienced legal English language tutors, both former legal professionals, it also features online support material that includes listening exercises to complement those based on writing and reading comprehension.”

Tip of the day: How to use ChatGPT to figure out if a text was written by ChatGPT

Post by Prof. Stephen Horowitz, Professor of Legal English

I had a thought about a week ago that perhaps the best way to check whether a text written by a student was in fact written by ChatGPT or other AI would be to ask ChatGPT itself. And it turns out there is a way to do it, but it’s not the way I thought it might work.

I had already tested ChatGPT by giving it the same final exam fact pattern/question I had given to my LLM students. And ChatGPT was able to provide a pretty decent response.

So next, I took that response and fed it back to ChatGPT asking it if ChatGPT had written this text. But ChatGPT responded by explaining that it has no memory, etc. So I figured that was the end of it.

However, I learned today from a commenter on LinkedIn that, as in law school, it’s really about asking the right questions. And in this case, the right question is:

“Is this text AI-generated?”

This led to a very definite response that, yes, this text was indeed AI-generated. Problem solved!

Update: I learned after publishing this post that this is not exactly “problem solved.” A colleague at another law school shared that he followed a process similar to mine, but ChatGPT believed that the text was in fact written by a human. So the “tip” is far from foolproof.

But I wasn’t satisfied, so I asked it how it knew. And it very quickly and confidently listed off several factors:

“As an AI language model, I can recognize patterns and characteristics in text that are typical of machine-generated text. In this particular case, the text is characterized by a high level of technical language, a formal tone, and a logical structure that is typical of texts written by AI. In addition, the text does not contain any errors or irregularities that would be expected if it had been written by a human. All of these features suggest that the text was generated by AI.”

Wonderful! But wait a second… Lots of people, including most law students at Georgetown and other law schools, are trained to write with these very same characteristics. So I asked, “Can you be more specific?”

And it essentially listed the same characteristics, but this time in a numbered format which made it appear much more specific and persuasive. Even though it wasn’t.

So then I asked it specifically, “What kinds of differences might distinguish a text written by a highly skilled writer and a text generated by AI?” It listed only general qualities that might distinguish a human’s writing from AI’s, such as style, creativity, context and human touch. As a representative of the human race, I guess I’ll take those as compliments. But it still didn’t provide any concrete examples of how it can distinguish between a highly skilled human writer and an AI app like ChatGPT.

In other words, ChatGPT was essentially borrowing from Supreme Court Justice Potter Stewart who famously said in his decision on obscenity, “I know it when I see it.” (Jacobellis v. Ohio, 378 U.S. 184 (1964))

Online legal English for students from politically disrupted countries

Post by Prof. Stephen Horowitz, Professor of Legal English

One silver lining of the COVID-19 pandemic has been the increased accessibility and acceptability of online education. And one area where this has already provided great benefit in the field of legal English is online legal English education for students from politically disrupted countries.

Example 1: Female judges fleeing from Afghanistan

I learned this in Spring of 2022 when I was collaborating with Prof. Daniel Edelson of Seton Hall Law School (Daniel is a former legal English colleague from St. John’s Law and founder of USLawEssentials.com) on the creation of an online legal English legal writing course to be offered to foreign-educated attorneys in May/June 2022. As we started to make people aware of the course–which we originally anticipated would be of interest to foreign-educated attorneys preparing for the summer bar exam and/or preparing to start an LLM program in the fall–we were contacted by the Alliance for International Women’s Rights (AIWR) which, among other activities, had been running a mentoring program that matched US lawyers and judges with female judges in Afghanistan prior to the US military withdrawal.

Continue reading “Online legal English for students from politically disrupted countries”

Can ChatGPT help LLMs pass the bar exam?

The good news: Yes, it probably can!

The bad news: But it’s not the LLMs you’re probably thinking of.

I recently noticed in the abstract for the article “GPT Takes the Bar Exam” that the last line reads:

While our ability to interpret these results is limited by nascent scientific understanding of LLMs and the proprietary nature of GPT, we believe that these results strongly suggest that an LLM will pass the MBE component of the Bar Exam in the near future.

At first I did a double take and had to re-read the full abstract to understand how in the heck GPT’s relative success in answering bar exam questions could portend that one lucky future LLM student will pass the multiple choice section of the bar exam.

Then I remembered that LLM is different from LL.M. In the context of artificial intelligence, LLM means “Large Language Model,” the term used to describe what ChatGPT is. That is obviously very different from the Master of Laws (Legum Magister) meaning, which refers to a one-year degree at a law school and is often associated with international students in US law schools.

This is clearly a distinction that those of us in the legal English field will have to get used to in order to avoid potential confusion in the future. It also suggests that the periods in “LL.M.” may need to come back in fashion for those out there (like me) who have been trying to get away with leaving them out in the name of efficiency.

Here’s the full abstract, in case of interest:

**********************************

GPT Takes the Bar Exam

13 Pages Posted: 31 Dec 2022

Michael James Bommarito

273 Ventures; Licensio, LLC; Bommarito Consulting, LLC; Michigan State College of Law; Stanford Center for Legal Informatics

Daniel Martin Katz

Illinois Tech – Chicago Kent College of Law; Bucerius Center for Legal Technology & Data Science; Stanford CodeX – The Center for Legal Informatics; 273 Ventures

Date Written: December 29, 2022

Abstract

Nearly all jurisdictions in the United States require a professional license exam, commonly referred to as “the Bar Exam,” as a precondition for law practice. To even sit for the exam, most jurisdictions require that an applicant completes at least seven years of post-secondary education, including three years at an accredited law school. In addition, most test-takers also undergo weeks to months of further, exam-specific preparation. Despite this significant investment of time and capital, approximately one in five test-takers still score under the rate required to pass the exam on their first try. In the face of a complex task that requires such depth of knowledge, what, then, should we expect of the state of the art in “AI?” In this research, we document our experimental evaluation of the performance of OpenAI’s text-davinci-003 model, often-referred to as GPT-3.5, on the multistate multiple choice (MBE) section of the exam. While we find no benefit in fine-tuning over GPT-3.5’s zero-shot performance at the scale of our training data, we do find that hyperparameter optimization and prompt engineering positively impacted GPT-3.5’s zero-shot performance. For best prompt and parameters, GPT-3.5 achieves a headline correct rate of 50.3% on a complete NCBE MBE practice exam, significantly in excess of the 25% baseline guessing rate, and performs at a passing rate for both Evidence and Torts. GPT-3.5’s ranking of responses is also highly correlated with correctness; its top two and top three choices are correct 71% and 88% of the time, respectively, indicating very strong non-entailment performance. While our ability to interpret these results is limited by nascent scientific understanding of LLMs and the proprietary nature of GPT, we believe that these results strongly suggest that an LLM will pass the MBE component of the Bar Exam in the near future.

Keywords: GPT, ChatGPT, Bar Exam, Legal Data, NLP, Legal NLP, Legal Analytics, natural language processing, natural language understanding, evaluation, machine learning, artificial intelligence, artificial intelligence and law

JEL Classification: C45, C55, K49, O33, O30

Suggested Citation:

Bommarito, Michael James and Katz, Daniel Martin, GPT Takes the Bar Exam (December 29, 2022). Available at SSRN: https://ssrn.com/abstract=4314839 or http://dx.doi.org/10.2139/ssrn.4314839
