Yellow alert for arbitration


A yellow alert graphic
Credit: bagera3005 (CC BY)

I wrote a post the other day about Arbitrus.ai, an AI “arbitrator” described in a recent paper. I wasn’t sure whether the paper was an elaborate prank meant as a send-up of AI hypesters and folks with strange ideas about formalism in the law. But no, it turns out, the authors are serious, and they have launched a business that, it seems, you could hire today to have its AI agent decide real cases involving real people or entities.

Are you worried that an AI cannot make human judgments about what evidence to credit and what evidence not to credit? Not to worry: the authors think that their AI can operate as a lie-detecting machine. Are you concerned about consumers, for whom, in the words of the Reporters of the new Restatement of the Law of Consumer Contracts, it is “practically impossible” to “scrutinize the terms” of the many form contracts consumers sign every day “and evaluate them prior to manifesting assent”? How naive. Arbitration, they write, “isn’t sunshine and rainbows; it’s largely designed so that you lose. With Arbitrus.ai, you just lose faster and more transparently.”

Lawyers interested in the integrity and quality of arbitration as a method of dispute resolution need to think seriously, and now, about how the FAA and the New York Convention would or should treat an agreement that provides for “arbitration” by Arbitrus.ai or another AI bot. Here are a few questions that are meant to provoke thought.

  • Suppose a contract between A and B provided that any disputes between them would be decided by a coin toss, and that the coin would be tossed by a person agreed to by the parties, who would prepare a document explaining what the dispute was about and indicating the results of the coin toss. Is that person an arbitrator? Is the document he or she prepares an award? Is the process the parties have agreed to an arbitration?
  • Suppose the President wants to appoint an AI bot to be a judge. Could an AI bot be an “officer” under the Appointments Clause? If not, why not?

My basic thought about AI in the law is that law is a social phenomenon that is made by humans for humans. Only humans can have rights, because the law is made for us. I wrote about this in one of my posts on animals’ supposed right to habeas corpus, where I came at it from the other direction, so to speak:

If law, or maybe Law, is a social phenomenon, then it’s for us (that is, us humans), and is meaningful to us and only to us. It’s a big part of the way we organize ourselves. Other animals have other systems. Chimpanzees apparently have interesting and sophisticated ways of living together. It just seems like a very basic mistake to treat nonhumans as subjects of the law rather than objects in the law. If you say, “human infants, the insane, etc. lack capacity just as a chimpanzee lacks capacity,” I would answer, “human infants, the insane, etc., are part of human society in a way that chimpanzees are not.” No doubt intrepid primatologists can insert themselves into a chimpanzee society and participate in it in some way for purposes of study (because unlike chimpanzees, humans are wicked smart and can do things like anthropology and primatology). But that’s for the purpose of study, not for the purpose of leading a life. And a chimpanzee cannot, by its nature, be a part of human society. That’s not to say there’s something wrong or deficient about chimpanzees. That’s just how they are. To repeat, this isn’t a point about the philosophical question of what constitutes a person; it’s a point about whom the law is for. It’s for us.

Portrait in black and white of Wesley Hohfeld, wearing an old-fashioned suit.
Wesley Hohfeld (public domain).

I think one can make a similar point about whether entities other than human beings can have duties under the law, and about whether entities other than human beings can have powers under the law. If law is by humans and for humans, does it make any sense to say that someone other than a human being can be, say, a judge, or an arbitrator?

Given the speed of change in the field of artificial intelligence, it behooves us to get ahead of this issue, so that we are ready conceptually to face the challenge that AI decisionmakers will pose.

I want to be clear that I am not saying there is no place for AI in arbitration or in any other area of the law. The American Arbitration Association has just announced that it is making available to arbitrators the Clearbrief AI writing tool. Now, I have also expressed skepticism about the overuse of AI in legal writing, but there’s nothing conceptually wrong with using AI in some aspects of the writing process. And there’s a big difference between tools to help humans and tools to replace humans and do things that only humans really can do, like exercise human judgment.

