
Embracing the Inevitable

Integrating AI Technologies in Mediation

May 2024


This article discusses current and potential future uses of artificial intelligence in mediation and suggests how mediators can best integrate AI technologies into their practice.

Given current trends in technology, mediators need to understand artificial intelligence (AI) and have a plan to work with and integrate AI into their practices. AI is growing rapidly, leading to advancements in a number of fields.1 While technological advancements are typically encouraged, one of the most alarming concerns arising from the growth of AI is the fear that it will replace human workers, with 300 million jobs worldwide expected to be impacted by AI and two-thirds of US jobs at risk from some form of AI automation.2 The legal field is often considered a fertile area for AI automation and large language model learning.3 For example, ChatGPT demonstrated the reality of AI’s advancement in recent years when it “sat for” the July 2022 bar exam and scored near the 90th percentile of test-takers.4 A majority of mediators are attorneys or retired judges, so AI’s integration into the legal field will undoubtedly have implications for mediation.5

This article explores the current and future impact of AI on dispute resolution and suggests ways for mediators to incorporate AI into their practice. It gives a brief background on AI, discusses current and emerging AI mediation technologies, considers the strengths and weaknesses of humans and AI in mediation, and provides suggestions for how mediators can partner with these tools to reach optimal mediated outcomes for clients.

A Short AI Primer

While there are many definitions of “AI,” it is generally understood to be “[t]he theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”6 AI relies on machine learning, which encompasses “technologies and algorithms that enable systems to identify patterns, make decisions, and improve themselves through experience and data.”7 Natural language processing (NLP) “is a machine learning technology that gives computers the ability to interpret, manipulate, and comprehend human language.”8 Further, “[g]enerative AI represents an advanced subset of NLP models called [large language models] designed to produce human-like text.”9

Mediators have previously embraced technologies such as videoconferencing, online mediation platforms, and electronic document management, and will likely also find benefits in AI-based technology.10 As discussed below, AI is particularly strong at generating and exploring optimal solutions. Given mediation’s goal of achieving fair resolutions that take into account each party’s best alternative to a negotiated agreement (BATNA) and worst alternative to a negotiated agreement (WATNA), the value of AI mediation tools in generating legitimate resolutions is apparent.11 AI mediation tools can identify and analyze mediation patterns, helping mediators manage expectations in relation to the BATNA and WATNA of any given mediation.12
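To make the BATNA/WATNA framing concrete, the short Python sketch below (a minimal illustration with invented figures, not drawn from any cited tool) checks whether a proposed settlement falls inside the zone both parties would prefer to their alternatives, the kind of expectation-setting analysis an AI tool could help automate.

```python
# Minimal, hypothetical illustration of the BATNA/WATNA framing. Figures are invented.

def within_zone(proposal: float, claimant_minimum: float, respondent_maximum: float) -> bool:
    """claimant_minimum is the least the claimant would accept (the value of their best
    alternative); respondent_maximum is the most the respondent would pay before
    preferring their own alternative."""
    return claimant_minimum <= proposal <= respondent_maximum

# Hypothetical dispute: the claimant's best alternative is worth $180,000 to them,
# and the respondent would rather litigate than pay more than $320,000.
for offer in (150_000, 250_000, 350_000):
    status = "inside zone" if within_zone(offer, 180_000, 320_000) else "outside zone"
    print(f"${offer:,}: {status}")
```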

Current and Future Uses of AI in Mediation

The “fourth party” is a term often used to refer to technology that assists with resolving disputes online.13 The term was coined as a metaphor for online dispute resolution (ODR), where the fourth party is considered “foundational” and “becoming more capable all the time.”14 Fourth-party dispute resolution has been in use for years. For example, at least as early as 2014, over 90% of eBay’s 60 million annual disputes were being handled with no human intervention.15 AI is considered a tool underlying the fourth party and, with the continued development of AI, the future could include AI-powered fourth parties performing “case research and evaluation (perhaps helping us to envision our [zone of potential agreement]), conflict coaching, communication reframing, evaluation of alternatives to a potential settlement, enforcement of outcomes, document drafting and submission to legal bodies, or even automated negotiation or binding algorithmic evaluations.”16 Some have even argued that AI technology could be considered a third party—both in terms of resolving disputes independent of human intervention and, at least with regard to intake and preliminary communications with mediation parties, in assisting a human mediator by summarizing the conflict and providing potential solutions.17

Today, AI technology is being used in mediation in two primary ways: (1) in a supportive role where a mediator supplements their work with AI or (2) in a substitutive capacity where AI takes on the essential functions of a mediator.18 One ODR divorce mediation system, Family Winner, currently uses AI in a supportive capacity. Each disputing party independently enters property items and their subjective values of the items into the mediation system.19 Then, Family Winner uses AI “to come up with a nominally optimal solution for distribution.”20 The suggested solution can then be accepted or rejected; if rejected, the parties can rank the remaining contested items.21
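Family Winner’s actual algorithm is not detailed here; as a rough, hypothetical illustration of the point-allocation idea it embodies, the Python sketch below assigns each contested item to whichever party placed the higher subjective value on it and reports the share of each party’s own points that the proposed split delivers.

```python
# Hypothetical point-allocation sketch, loosely inspired by systems like Family Winner
# (this is not its actual algorithm). Each party spreads 100 points across the
# contested items to express subjective value.

party_a = {"house": 50, "car": 10, "art": 25, "savings": 15}
party_b = {"house": 30, "car": 20, "art": 10, "savings": 40}

allocation = {}
for item in party_a:
    # Award each item to whichever party values it more (ties go to party A here).
    allocation[item] = "A" if party_a[item] >= party_b[item] else "B"

# Report the share of each party's own valuation that the proposed split delivers.
share_a = sum(points for item, points in party_a.items() if allocation[item] == "A")
share_b = sum(points for item, points in party_b.items() if allocation[item] == "B")
print(allocation)
print(f"A receives {share_a} of 100 points; B receives {share_b} of 100 points")
```

A real system would add trade-off and compensation logic, which is where the “justice” and “fairness” concerns discussed next arise.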

Researchers have observed that decision support systems like Family Winner fail to adequately optimize “justice” or “fairness” metrics, which are arguably unique to humans.22 Decision support systems have been found effective for certain types of conflicts (e.g., international disputes), but they miss the mark on more subjective matters of importance that remain—at least for now—in the domain of humanity.23 A tool like Family Winner can add value to, but not replace, a human mediator because resolving disputes often turns on an individual’s motivations, values, and emotional judgment of fairness.24

Adding value based on party interests and perceptions of fairness is where AI can fall short. Fairness is subjective and “can be distilled into four basic, competing principles or rules—equality, need, generosity, and equity.”25 A party is more likely to perceive an outcome as fair when the outcome more closely aligns with the outcome they anticipated at the outset of negotiation.26 When two disputing parties have a wide gap in their anticipated outcome, at least one party will likely perceive the negotiated outcome as unfair unless value can be created to bridge the divide. While AI tools are considered neutral and thus arguably able to “ensure fairness and promote trust among the disputing parties,” the subjective nature of fairness in mediation requires human emotional intelligence to provide disputants with “analysis and support to make the final decisions they subjectively perceive as fair.”27

Stated differently, decision support systems like Family Winner require “substantial human input.”28 With this in mind, substitutive AI systems in mediation “are still subject to slow development” but are improving at a rapid rate.29 These substitutive tools’ ability to generate proposed resolutions is especially promising. For example, Split-Up, a case reasoning system (a system that applies past outcomes of cases to the current situation), examines 94 different factors in a divorce and provides suggestions based on the outcomes of previous cases exhibiting similar facts.30 Moreover, these tools could help address the access to justice problem, particularly for disputants with limited resources, because AI systems can resolve disputes efficiently, especially low-value cases.31
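Split-Up’s internal model is not described in the sources beyond its reliance on prior cases and 94 factors; the sketch below is a generic, simplified illustration of case-based reasoning: scoring a new dispute’s similarity to past cases over a few hypothetical numeric factors and averaging the outcomes of the closest matches.

```python
import math

# Illustrative case-based reasoning sketch (not Split-Up's actual model): each past
# case is a vector of hypothetical factors plus its outcome, expressed here as the
# share of assets awarded to party A.
past_cases = [
    # (marriage_years, children, income_gap) -> share of assets to party A
    ((25, 3, 0.6), 0.65),
    ((4, 0, 0.1), 0.50),
    ((12, 2, 0.4), 0.58),
]

def distance(a, b):
    """Euclidean distance between two factor vectors (a real system would
    normalize factors so no single one dominates)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def suggest(new_case, k=2):
    """Average the outcomes of the k most similar prior cases."""
    nearest = sorted(past_cases, key=lambda case: distance(case[0], new_case))[:k]
    return sum(outcome for _, outcome in nearest) / k

print(suggest((10, 2, 0.5)))  # suggestion drawn from the closest prior cases
```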

While the above examples show the utility of AI-assisted ODR platforms in family law cases, systems like ChatGPT that can handle a number of different tasks may also prove especially valuable to mediators. ChatGPT was recently used in the background of a landlord-tenant dispute by an experienced mediator, Myer Sankary.32 While the landlord and tenant reached a $270,000 settlement without the use of AI, ChatGPT impressively arrived at a very similar recommended figure of $275,000.33

A variety of other tools are less robust in solution generation but nevertheless add a great deal of convenience for disputants, such as eBay’s high-volume ODR system and ICANN’s uniform domain-name dispute-resolution system.34 One such online dispute resolution tool is currently being developed by the Colorado courts for pilot use in small claims court matters.35 When deployed, this tool will allow parties to explore resolutions with the assistance of a facilitative mediator—all online and asynchronously, so that participants do not need to travel to a courthouse or mediation office.36 With a range of options available, and their adoption perhaps unavoidable, now is the time for mediators to understand the role AI tools can play in resolving disputes and to explore ways to incorporate them into their practice.

How AI Can Assist Mediators

Even as it improves, AI’s best use in mediation will likely be to assist rather than replace a human mediator. Indeed, when asked whether AI could replace a human mediator, ChatGPT provided the following response:

As an AI language model, I can provide information and suggestions based on data and algorithms, but I cannot replace the role of a human mediator. A mediator is a trained professional who helps parties in conflict to reach a mutually agreeable solution. A mediator often relies on a combination of communication skills, active listening, empathy, and the ability to recognize and address power imbalances to help people in dispute find common ground. While I can provide general information on conflict resolution, I cannot replace the human qualities that a mediator brings to the process. Mediation often involves emotional intelligence, which is an area where AI models like myself still have limitations.37

As suggested by the ChatGPT response, the sine qua non of effective mediators is “a pronounced capacity for empathy, compassion, and a desire to understand others.”38 Human mediators can make parties feel empowered, which can promote resolution.39 Ultimately, as has been noted widely, AI cannot replace “human mediators’ interpersonal approach and innovative problem-solving capabilities.”40

Human mediators, however, have flaws. They have limited cognitive capacity, which can slow down mediation or prevent optimal results.41 This limitation also means human mediators can only take on so many disputes at once. Humans also cannot avoid cognitive biases, which can severely impact impartiality.42 Such biases, as well as everyday stressors, can also lead to inconsistent results. Finally, while ethical mediators make every effort to preserve confidentiality, human nature can result in (often inadvertent) lapses in confidentiality.43

AI excels in many of the exact areas in which human mediators are limited. With regard to capacity, AI systems “are able to quickly store, analyze, and access vast amounts of data.”44 AI systems are not physically limited; they can run nonstop and scale to “help with the ever-increasing number of disputes that can be resolved with mediation.”45 AI systems can also better guard confidentiality, which could mean parties to a dispute are more willing to share embarrassing or private details.46

However, AI is not without imperfections. For example, AI can reflect biases and inconsistencies because the systems are trained by humans, who have biases of their own.47 Using AI therefore requires human oversight to monitor outputs for bias and inconsistency. Algorithmic transparency and human monitoring are necessary to compensate for any preprogrammed biases embedded in AI technology.48 AI tools tend to be less prone to human cognitive biases when humans work with AI to monitor for algorithmic bias.49 By working together, humans and AI tools can minimize their respective weaknesses and help mediators work more efficiently toward better solutions.
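As a hypothetical illustration of the human monitoring described above, the sketch below compares an AI tool’s average suggested outcomes across two groups of otherwise similar disputes; a sizable gap would prompt the mediator to examine the tool’s inputs and training data. Real bias audits are considerably more rigorous.

```python
# Hypothetical, minimal check for disparate suggestions across two groups of
# otherwise similar disputes; real bias audits are far more rigorous.
suggestions = [
    {"group": "represented", "suggested": 120_000},
    {"group": "represented", "suggested": 110_000},
    {"group": "self_represented", "suggested": 80_000},
    {"group": "self_represented", "suggested": 85_000},
]

def group_mean(group: str) -> float:
    values = [s["suggested"] for s in suggestions if s["group"] == group]
    return sum(values) / len(values)

gap = group_mean("represented") - group_mean("self_represented")
print(f"Average gap between groups: ${gap:,.0f}")  # a large gap warrants human review
```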

As mediators and attorneys integrate AI into their practice, they need to stay informed about how ethical requirements are evolving to reflect AI’s role in the legal field. Particularly notable for attorneys representing clients in mediation is an ABA competency mandate for attorneys to understand relevant technology such as AI.50 All attorney mediators participating in ODR should also be aware of the ethical framework that governs ODR and technological systems (like AI) employed in dispute resolution.51 Although the use of AI by attorneys is in its early stages, there are a number of ethical and professional conduct issues that attorneys must consider, particularly related to providing client-specific information to an AI system, and there are not necessarily definitive answers yet.52 Such issues include confidentiality, informed consent, bias, and liability.53 For example, while AI systems can theoretically better guard confidentiality, attorneys using AI systems must consider issues of attorney-client privilege, the use of personally identifiable information, and information security when using AI systems with client data to ensure they are not violating any rules in their jurisdiction.54

In Colorado, there is an ongoing discussion as to whether and how conduct rules should be amended to accommodate the rise and usage of AI tools.55 Critics of AI in the legal field point to real-life AI blunders as understandable concerns with the use of AI and argue that attorneys “cannot carelessly cede professional responsibility to AI.”56 Ultimately, there are a number of professional conduct and ethical considerations at play, and interested attorneys should be active in these discussions in order to “lead the responsible adoption of artificial intelligence.”57

A Synergistic Approach to Using AI in Mediation

A synergistic approach—interactions “that when combined produce a total effect that is greater than the sum of the individual elements”—is the best way to balance the strengths and weaknesses of human mediators and AI mediation tools.58 Mediators already understand how to find integrative solutions that maximize value and result in win-win outcomes for the parties.59 Partnering with AI for better mediated outcomes should therefore come as second nature to mediators. Combining AI with mediation practice synergistically can help mediators find more, or better, integrative solutions to the complex problems often presented during mediation. AI tools cannot yet replace human mediators, and the human element required during the mediation process casts doubt on whether these tools will ever be able to do so completely.60 But mediators can use these tools now to enhance their practice. Human mediators can view AI mediation tools as synergistic helpers that make their practice more efficient, preempt potential shortcomings or blind spots, and assist in brainstorming and finding optimal solutions. ChatGPT has already been shown to assist mediators in a number of ways, such as searching for and interpreting information, responding to mediator questions, generating possible dispute resolutions, formulating questions, and offering communication tools.61 These tools are available now, and they can be extremely useful to practicing mediators.

AI systems work best when “trained,” and a mediator should think of an AI system as a virtual colleague. Steps a mediator could use to train and evaluate an AI tool are described below, followed by a brief illustrative sketch.

  1. Feed an AI-powered mediation tool the relevant rules, guidelines, and best practices related to the mediator’s practice areas.
  2. Provide other inputs (information about parties, goals, and other important factors related to the background of disputes) and previous resolutions from a sampling of prior mediated disputes.
  3. Prompt the mediation tool to recommend possible solutions, which the mediator could use as suggestions for the parties to consider before or during mediation.
  4. Compare these AI-recommended solutions with non-AI solutions recommended or considered in the dispute.
  5. Evaluate the potential shortcomings of the AI’s solution compared to the shortcomings encountered in the actual resolution of the dispute without using AI.
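
The sketch below illustrates how steps 1 through 3 might look in practice, assuming an OpenAI-style chat API; the model name, prompt wording, and inputs are placeholders, and any client-identifying details should be anonymized or withheld in light of the confidentiality issues discussed above.

```python
# Hypothetical sketch of steps 1-3 above using an OpenAI-style chat API.
# Model name, prompt wording, and inputs are placeholders; confidential or
# identifying party information should not be sent to an external service
# without weighing the ethical issues discussed earlier in this article.

from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

rules = "Summary of the mediator's practice rules, guidelines, and best practices."
background = "Anonymized description of the parties, their goals, and the dispute."
prior_outcomes = "Short summaries of resolutions from comparable prior mediations."

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model choice
    messages=[
        {"role": "system", "content": f"You assist a mediator. Follow these rules:\n{rules}"},
        {"role": "user", "content": (
            f"Dispute background:\n{background}\n\n"
            f"Prior outcomes:\n{prior_outcomes}\n\n"
            "Suggest three possible resolutions the parties might consider, "
            "with a brief rationale for each."
        )},
    ],
)

print(response.choices[0].message.content)  # candidate solutions for step 3
```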

Some questions a mediator may consider in comparing the actual resolution with the AI-generated resolution include:

  • Why were the resolutions different? What were the differences in how the AI mediation tool and the parties prioritized key factors?
  • Which resolution best maximizes the interests of the parties?
  • Which resolution seems more equitable and would be considered by a party as “fair”?
  • Did the AI mediation tool’s solution address an issue of bias that the parties or mediator did not detect? Did the AI mediation tool’s solution reflect bias, inaccuracy, or inconsistency?

After considering these issues, a mediator could feed the AI mediation tool additional data or context that the mediator believes may have been relevant, such as party-specific priorities. This may adjust the “weight” the tool gives to specific factors in the future and thus bridge the gap where the mediator believes the suggested resolution could have been improved.

Alternatively, if the AI mediation tool’s proposed resolution was a good or better alternative, the mediator could use that to improve their own knowledge and practice. Over time, AI mediation tools would, through specifically tailored data, become better suited at performing helpful tasks, such as providing an answer to a query, brainstorming questions or possible solutions, or drafting a stipulation. Eventually, the AI mediation tool could perform a majority of the routine and repetitive work for simpler disputes. This would allow a mediator to handle simpler disputes at a lower cost, increase workload capacity, and focus on more complex disputes.

Conclusion

AI mediation tools are another form of technology that mediators can integrate into their practices, as they have done with email, remote mediation software, calendaring tools, and billing software. AI-powered mediation tools are best thought of as tools to assist, rather than replace, human mediators. Incorporating AI tools and feeding them data and preferences will allow them to generate better results. Using AI could mean more options for parties and increased efficiency for mediators, allowing mediators to focus on bringing human qualities to the table to help parties overcome impasses in the most complicated disputes. Ultimately, mediators who integrate AI mediation tools into their practices will be better situated than those who ignore them, as widespread adoption of tools like ChatGPT will lead parties to expect mediators to adopt these tools.

Ryan Searson is a JD candidate in the class of 2026 at the University of Denver Sturm College of Law. He is employed as a financial regulatory analyst with Foley & Lardner LLP, supporting the firm’s securities enforcement and litigation practice—rsearson26@law.du.edu. He sincerely thanks Professor Wesley Parks at Denver Law for his guidance, useful comments, and edits of this article. Coordinating Editor: Wesley Parks, wparks@law.du.edu.


Notes

1. Nosta, “Stacked Exponential Growth: AI Is Outpacing Moore’s Law and Evolutionary Biology,” Medium (Apr. 14, 2023), https://johnnosta.medium.com/stacked-exponential-growth-ai-is-outpacing-moores-law-and-evolutionary-biology-12882c38b68d.

2. Johnson, “Which Jobs Will AI Replace? These 4 Industries Will Be Heavily Impacted,” Forbes (Mar. 31, 2023), https://www.forbes.com/sites/ariannajohnson/2023/03/30/which-jobs-will-ai-replace-these-4-industries-will-be-heavily-impacted.

3. Id.; Onit, “How Large Language Models (LLMs) Can Uniquely Supercharge Vital Legal Work,” Onit blog (Oct. 24, 2023), https://www.onit.com/blog/how-llms-can-supercharge-vital-legal-work.

4. Cassens Weiss, “Latest Version of ChatGPT Aces Bar Exam With Score Nearing 90th Percentile,” ABA J. (Mar. 16, 2023), https://www.abajournal.com/web/article/latest-version-of-chatgpt-aces-the-bar-exam-with-score-in-90th-percentile.

5. See “The Mediator’s Spectrum: Four Reasons to Consider Non-Attorney Mediators,” JD Supra (Feb. 2, 2023), https://www.jdsupra.com/legalnews/the-mediator-s-spectrum-four-reasons-to-5016075; O’Leary, “Why AI Isn’t the Zombie Apocalypse for Legal Professionals,” Thomson Reuters (Sept. 15, 2023), https://legal.thomsonreuters.com/blog/ai-isnt-the-zombie-apocalypse-for-legal-professionals.

6. Artificial intelligence, Oxford Reference, https://doi.org/10.1093/oi/authority.20110803095426960.

7. “Artificial Intelligence (AI) vs. Machine Learning,” Columbia Engineering, https://ai.engineering.columbia.edu/ai-vs-machine-learning.

8. “What Is Natural Language Processing (NLP)?,” AWS, https://aws.amazon.com/what-is/nlp.

9. Onit, supra note 3.

10. Reeve, “The Use of Technology in Mediation,” LinkedIn (May 25, 2023), https://www.linkedin.com/pulse/use-technology-mediation-jerry-reeve.

11. See Holland, “BATNA & WATNA: Finding and Using Negotiation Power,” ADR Times (Mar. 10, 2023), https://www.adrtimes.com/batna-watna.

12. Jackson, “The AI Revolution: A New Frontier for Mediation,” Maneuver Mediation (June 9, 2023), https://mitchjackson.com/2023/06/09/new-mediation-frontier.

13. Rainey, “Third-Party Ethics in the Age of the Fourth Party,” 1 Int’l J. Online Disp. Resol. 37, 39–40 (2014).

14. Wing et al., “Designing Ethical Online Dispute Resolution Systems: The Rise of the Fourth Party,” 37 Negot. J. 1, 3–4 (2021).

15. Rainey, supra note 13 at 39.

16. Wing, supra note 14 at 4.

17. See Melamed, “Is Technology Now the ‘3rd Party’ or ‘4th Party’ in Dispute Resolution?,” Mediate (Oct. 23, 2023), https://mediate.com/should-we-rename-the-third-and-fourth-parties-in-mediation.

18. Alessa, “The Role of Artificial Intelligence in Online Dispute Resolution: A Brief and Critical Overview,” 31 Info. & Commc’ns Tech. L. 319, 326 (2022).

19. Id. at 327.

20. Id.

21. Id.

22. Bellucci and Zeleznikow, “Developing Negotiation Decision Support Systems That Support Mediators: A Case Study of the Family Winner System,” 13 A.I. & L. 233, 236 (2005).

23. See id. at 263; Alessa, supra note 18 at 328.

24. See Welsh, “Perceptions of Fairness in Negotiation,” 87 Marq. L. Rev. 753, 754 (2004).

25. Id.

26. Id. at 754–55.

27. Reeve, “Can AI Take Over From a Mediator? Exploring the Potential of Artificial Intelligence in Conflict Resolution,” LinkedIn (June 21, 2023), https://www.linkedin.com/pulse/can-ai-take-over-from-mediator-exploring-potential-artificial-reeve; Hasselfield, “Human Mediators vs. Artificial Intelligence,” Law360 Can. (Oct. 16, 2023), https://www.law360.ca/ca/articles/1772407.

28. Alessa, supra note 18 at 327.

29. Id. at 329.

30. Id.

31. Id. at 325.

32. Weisheit and Salger, “Artificial Intelligence (AI) in Mediation—ChatGPT as Mediator 4.0,” Mediate (June 21, 2023), https://mediate.com/artificial-intelligence-ai-in-mediation-chatgpt-as-mediator-4-0.

33. Id.

“Online Dispute Resolution: Companies Implementing ODR,” University of Missouri School of Law (Mar. 14, 2024), https://libraryguides.missouri.edu/c.php?g=557240&p=3832247.

35. Email from Sharon Sturges, judicial access and inclusion manager, Colorado State Court Administrator’s Office, to Ryan Searson, student, University of Denver Sturm College of Law (Nov. 15, 2023) (on file with author).

36. Id.

37. Weisheit and Salger, supra note 32.

38. Foit, “Your Artificial Mediator Is Ready for You Now: The Role of Artificial Intelligence in Conflict,” 15 Am. J. Mediation 43, 44–45 (2022).

39. Id. at 45.

40. Panetta, “AI is Smart, But It Can’t Replicate the Human Touch in Mediation,” Bloomberg L. (Aug. 17, 2023), https://news.bloomberglaw.com/us-law-week/ai-is-smart-but-it-cant-replicate-the-human-touch-in-mediation.

41. Foit, supra note 38 at 47.

42. Id. at 45–46.

43. Id. at 47.

44. Id.

45. Id. at 54.

46. Id. at 55–56.

47. Manyika et al., “What Do We Do About the Biases in AI?,” Harv. Bus. Rev. (Oct. 25, 2019), https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai.

48. Lawton, “AI Transparency: What Is It and Why Do We Need It?,” TechTarget (June 2, 2023), https://www.techtarget.com/searchcio/tip/AI-transparency-What-is-it-and-why-do-we-need-it.

49. Foit, supra note 38 at 54–55.

50. See ABA Model Rule of Prof’l Conduct 1.1 (1983); “Applying Today’s Legal Ethics to Today’s AI (Part 2),” casetext blog (Nov. 17, 2023), https://casetext.com/blog/ethical-use-ai-legal-2.

51. See Standards, International Council for Online Dispute Resolution, https://icodr.org/Standards. See also American Arbitration Association Model Standards of Conduct for Mediators (2005).

52. See, e.g., Swaner, “Risks and Challenges for Responsible AI Use in the Practice of Law,” 83 Iowa Law. 12 (Nov. 2023), https://www.iowabar.org/?pg=IowaLawyerMagazine&pubAction=viewIssue&pubIssueID=33575&pubIssueItemID=195777.

53. Id. at 12–13.

54. Id.

55. See Berkenkotter and Lipinsky de Orlov, “Artificial Intelligence and Professional Conduct,” 53 Colo. Law. 20 (Jan./Feb. 2024), https://cl.cobar.org/features/artificial-intelligence-and-professional-conduct.

56. Volpe, “ChatGPT and the Law: AI’s Negative Impact on the Administration of Justice,” Volpe Law LLC (Aug. 19, 2023), https://www.volpelawllc.com/chatgpt-and-the-law-ais-negative-impact-on-the-administration-of-justice.

57. Swaner, supra note 52 at 14.

58. Synergy, Dictionary.com, https://www.dictionary.com/browse/synergy.

59. See, e.g., “‘Win-Win’ Negotiations in Divorce Mediation,” Weinberger Mediation Center blog, https://www.weinbergermediation.com/blog/mediation/win-win-negotiations-divorce-mediation.

60. Panetta, supra note 40.

61. Weisheit and Salger, supra note 32.