
Legal Research in the Age of AI

Using Hindsight to Inform the Future

May 2024


Many, many years ago, the arrival of the Internet revolutionized the world. The World Wide Web sped up globalization by making communication, the transfer of knowledge, and commerce globally accessible in a matter of minutes. Ever since this transformative technology reached the masses, every technology company has been seeking the next “Internet”—the next big thing. In recent memory, the next big thing has taken the form of digital assistants, blockchain, and, as of late, generative artificial intelligence (AI).

Generative AI technology provides plenty of promise and peril. Its appeal extends well beyond the legal field. People throughout the world have extolled its virtues: anyone can log on to any social media platform and be inundated with influencer after influencer proclaiming that learning to use generative AI should be everyone’s first priority. Professionals, too, are being overwhelmed with articles from trade blogs, magazines, and newsletters that discuss the effective use of generative AI for specific tasks in their fields.

The three most widely known generative AI platforms are OpenAI’s ChatGPT, Google’s Gemini (formerly known as Bard), and Anthropic’s Claude. At the time of this writing, all three were available to sign up for and use, albeit at a cost for their premium products. Within the legal profession, companies have developed, or are working toward, meaningful ways of using generative AI. RELX’s LexisNexis,1 Thomson Reuters’ Westlaw,2 and Bloomberg Industry Group’s Bloomberg Law3 are all experimenting with adding generative AI to their legal research platforms. Most have released these generative AI features to law schools, law firms, and other select groups for testing and implementation. Even vLex, the company that merged with Fastcase, offers a generative AI legal assistant.4 Even as these releases are still being refined, each of these companies is doing its best to advertise its generative AI products to legal professionals.

For many, generative AI appears to fulfill a promise made so long ago by legal research platform providers—to streamline the legal research process so lawyers can maximize value to the client in other avenues. Although legal professionals are being overloaded with information regarding generative AI, it is important to remind ourselves that it is merely another tool in the legal resources tool belt. It is crucial that as the profession starts to embrace this new tool, we ground ourselves in the foundation of our practice—ethical decision-making, exceptional client service, and solid legal research processes. This article provides a brief overview of generative AI, discusses one lawyer’s (mis)use of generative AI, and explains how some foundational research methods can be successfully paired with generative AI.

Hey, Generative AI, Tell Me About Yourself

The term “AI” has been bandied around for quite some time. Siri, Alexa, Cortana, and the rest of the digital assistant ilk are types of AI. Natural language searching on Google and Bing, and within the leading legal research platforms, is also a kind of AI. “In its simplest form, AI is the overarching description for technologies that use computers and software to create intelligent, humanlike behavior.”5

Generative AI, on the other hand, refers to a very specific type of AI. In its simplest terms, generative AI can generate new content based on prompt inputs.6 This differs from the examples above because the output provided is actually new, whereas digital assistants and natural language searching simply direct the user to already existing content. Depending on the generative AI platform, this generated content can be video, graphical, or textual. As noted by Colin E. Moriarty in a recent Colorado Lawyer article, “lawyers have a special interest in generative AI because it seems capable of performing or assisting with many of the mechanical aspects of law practice, such as document review, legal research, legal writing, and blogging.”7 This text-based AI-generated content is arguably where lawyers and generative AI are destined to meet.

Unfortunately, text-based generative AI has two large drawbacks. First, the generated content is not always accurate. Inaccuracies can range from answers that are categorically wrong to answers that cite to made-up resources. These inaccuracies have been termed “hallucinations.” Second, the generated content is almost always presented as factual. Unlike Bing or Google, which provide websites with possible answers to a search query, the generated content provided by generative AI software is usually written authoritatively. This confident presentation can be deceptive and may lead researchers to believe that hallucinated results are, in fact, accurate.

Colorado Springs Break

Most of us probably think we’re immune to AI pratfalls, but last year at least two attorneys made headlines by filing documents with their respective courts that cited cases pulled from generative AI platforms. The first confirmed submission was made by a New York licensed attorney. The second attorney was located closer to home, in Colorado Springs.

The Colorado attorney prepared a motion to set aside judgment.8 In the motion, the attorney cited case law retrieved from ChatGPT.9 The attorney did not read or review the cases and submitted the motion to the court.10 At some point, the attorney learned the cases provided by ChatGPT were fabricated or wrong.11

Unfortunately, the attorney never informed the court of this issue, neither in writing nor at a hearing, and did not withdraw the motion.12 The attorney also falsely attributed the mistake to an intern.13 It was not until six days after the hearing that the attorney admitted to using ChatGPT. The attorney was suspended for his misconduct.14

ChatGPT-ogether

This unfortunate example does not mean that generative AI should be avoided, but it does mean that traditional legal research methods are still needed. From a legal research perspective, the Colorado attorney failed to perform two key steps: (1) reading the cases, and (2) reviewing the cases. Both steps should have been covered in any introductory legal research course the attorney took prior to graduating law school.

Step 1: Reading the Cases

Regardless of what technology they are using, an attorney must read every cited case and every cited resource. This is a time-consuming but necessary step in legal preparation that no tool will eliminate. Reading the cited material enables the attorney to (1) verify that the resource exists, (2) determine how the resource best applies to their client’s current issue, and (3) refine their argument by determining whether the cited resource is truly the best to cite.

Step 2: Reviewing the Cases

The attorney must also review (verify, refine, or update) every case regardless of what technology they are using. The Colorado attorney could have fulfilled this step by pairing the use of ChatGPT with any number of traditional legal research platforms. This pairing could have taken various shapes, as discussed below.

Manually searching for the cases. This pairing is one of the more time-consuming ways to authenticate the cases produced by generative AI. Here, the researcher manually copies the case citation, party names, or docket number and conducts a search within a traditional legal research database such as Westlaw, Lexis, or Fastcase. Any hallucinated cases would not show up in the legal research database.
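For researchers comfortable with a bit of scripting, this manual lookup can also be batched. The sketch below is illustrative only: the endpoint, parameter names, and JSON fields are assumptions standing in for whatever case-law search service a researcher actually licenses, and a real service would require its own authentication. The point is simply that any citation returning zero results gets flagged for hand-checking rather than trusted.

    import requests

    # Hypothetical case-law search endpoint -- substitute the service your
    # organization licenses; the parameter and field names are assumptions.
    SEARCH_URL = "https://example-caselaw-api.test/search"

    citations = [
        "Brown v. Board of Education, 347 U.S. 483 (1954)",
        "People v. Crabill, No. 23PDJ067 (Colo. O.P.D.J. 2023)",
    ]

    for cite in citations:
        # Ask the database whether anything matches the citation text.
        resp = requests.get(SEARCH_URL, params={"q": cite}, timeout=30)
        resp.raise_for_status()
        hits = resp.json().get("count", 0)

        # Zero hits does not prove the case is fabricated, but it does mean
        # the researcher must locate and read the opinion before citing it.
        print(f"{cite}: {'found' if hits else 'NOT FOUND - verify by hand'}")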

Using a drafting tool. This pairing offers a quicker approach to reviewing cases. Here, the researcher uses a legal research platform’s legal drafting aid to review the reliability of cases cited within an uploaded document. These include:

  • Bloomberg Law’s Brief Analyzer. This tool uses a form of AI known as machine learning to review the accuracy of citations and quotes, check or locate authority, and more.15
  • Fastcase’s Cloud Linking. This free tool automatically creates hyperlinks in the uploaded document to the corresponding case located in Fastcase.16
  • Lexis+’s Document Analysis. This tool leverages AI to scan uploaded documents for a variety of information, including providing a Shepard’s analysis on citations within the document.17
  • Westlaw’s Drafting Assistance. This tool verifies citations by inserting KeyCite flags, checks the citation format, creates the Table of Authorities, and more.18

Each of these resources would, in theory, dramatically cut the time the legal researcher spends checking citations. This is especially true because each of these platforms allows researchers to drag and drop their documents into the platform. The researcher can then complete other tasks while the platform runs its analysis.
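Researchers who draft in plain text can also pull the citations out of a document before (or instead of) uploading it to one of these tools. The sketch below uses the open-source eyecite library published by the Free Law Project to extract reporter citations from a draft; the exact method names should be checked against the library’s current documentation, and the sample text is only an illustration.

    # pip install eyecite
    from eyecite import get_citations

    draft_text = """
    Separate but equal is inherently unequal. Brown v. Board of Education,
    347 U.S. 483 (1954); see also Miranda v. Arizona, 384 U.S. 436 (1966).
    """

    # get_citations() scans free text and returns an object for each
    # citation it recognizes in the draft.
    for citation in get_citations(draft_text):
        # matched_text() returns the span of the draft recognized as a
        # citation; each entry becomes a manual lookup in Westlaw, Lexis,
        # or Fastcase.
        print(citation.matched_text())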

Consulting a law librarian. Finally, if a researcher cannot locate a case using a traditional legal research database, they can turn to a law librarian for help. It should be safe to assume that any case that neither the researcher nor a law librarian can find is probably hallucinated. This third method affords the researcher a second set of eyes to review whether a case exists.

Conclusion

Ultimately, hindsight is 20/20, and we cannot travel back in time to fix our mistakes. Rather, we must press on and move forward—and continue to learn, grow, and improve. The Colorado attorney discussed in this article has learned a lesson and is attempting to leverage AI to democratize legal services.19

As for the rest of us, we can take a step back to appreciate both the benefits and the pitfalls that generative AI will bring to the legal field. We can cautiously use emerging technology while simultaneously relying on the tried-and-true tools we were taught in law school. This does not mean we cannot learn new skills and innovate. Rather, we must implement these new tools carefully, in ways that do not harm the client or impede our candor to the court.

Aamir Abdullah is the instructional services and research librarian at the University of Colorado Law School’s William A. Wise Law Library. Previously, he practiced law for over five years in Texas, where he handled both state and federal cases. Professor Abdullah is passionate about access to justice and the intersection of law and technology—aamir.abdullah@colorado.edu. Coordinating Editor: Michelle Penn, michelle.penn@colorado.edu.


Notes

1. LexisNexis, “Introducing Lexis+ AI™: The Future Is Now” at 1, https://lnlp.widen.net/s/5ffc5jrwql/lexis-ai-choose-the-right-tool.

2. “Introducing AI-Assisted Research: Legal Research Meets Generative AI,” Thomson Reuters blog (Nov. 15, 2023), https://legal.thomsonreuters.com/blog/legal-research-meets-generative-ai.

3. Bloomberg Law, “Artificial Intelligence for Lawyers Explained” (Aug. 1, 2023), https://pro.bloomberglaw.com/insights/technology/ai-in-legal-practice-explained/#whatAI.

4. Reynolds, “vLex Releases New Generative AI Legal Assistant,” ABA J. (Oct. 17, 2023), https://www.abajournal.com/web/article/vlex-releases-new-generative-ai-legal-assistant.

5. Bloomberg Law, supra note 3.

6. Cassens Weiss, “Lawyers Should Take These Precautions When Using Artificial Intelligence, Florida Ethics Opinion Says,” ABA J. (Jan. 23, 2024), https://www.abajournal.com/news/article/lawyers-should-take-these-precautions-when-using-artificial-intelligence-florida-ethics-opinion-says.

7. Moriarty, “The Legal Challenges of Generative AI—Part 1: Skynet and HAL Walk Into a Courtroom,” 52 Colo. Law. 40, 41 (July/Aug. 2023), https://cl.cobar.org/features/the-legal-challenges-of-generative-ai-part-1.

8. People v. Crabill, No. 23PDJ067 (Colo. O.P.D.J. Nov. 22, 2023), https://coloradosupremecourt.com/PDJ/Decisions/Crabill,%20Stipulation%20to%20Discipline,%2023PDJ067,%2011-22-23.pdf.

9. Id.

10. Id.

11. Id.

12. Id.

13. Id.

14. Id.

15. Bloomberg Law, Using Brief Analyzer, https://www.bloomberglaw.com/external/document/XBKJB4IK000000/litigation-overview-using-brief-analyzer.

16. Suskin, “Fastcase Features: A Quick Guide for Former Casemaker Users” (Feb. 2022), https://cl.cobar.org/departments/fastcase-features.

17. Lexis+ Brief Analysis, https://www.lexisnexis.com/en-us/products/lexis-plus/document-analysis/brief-analysis.page.

18. Thomson Reuters, “Drafting Assistant Essential User Guide” at 4 (2018), https://training.thomsonreuters.com/media/Drafting%20Assistant%20Essential%20User%20Guide/1_usx5td9g.

19. See https://www.theaijd.com/about.