January 17, 2018

Law Jobs in the Age of Artificial Intelligence

From an Op-Ed by Philip Segal at Above the Law, Jan. 17, 2018:


There are a lot of frightened lawyers out there, scared that artificial intelligence will gobble up their jobs. Some lawyers are right to be scared: the ones who don’t do enough thinking while they make their living.

Think of all the times when you’re on the phone with a customer service person and are getting an answer that makes no sense to you (but seems perfectly fine to him). That rep, who can only explain his company’s policy with, “That’s what the computer is saying,” is like the lawyer whose job is doomed.

For a bright employment future, you want to be the lawyer who looks at the answers AI produces, not just the one who asks the computer questions. For those lawyers, and especially those who have a role in establishing the facts of a legal matter, the future with AI is bright indeed.

How can I tell? Because contrary to a lot of what you read about AI, we have extensive experience with how it works and how we work with it. QuickBooks is AI (now called “software” because we’re used to it). People who used to make a living adding up columns of numbers and transposing them into ledgers now have jobs doing something else. But accountants who must translate real life into an accounting statement that adheres to a gigantic and ever-changing tax code, subject to tens of thousands of human decisions as to interpretation? They’re still gainfully employed.

For lawyers, the basement dwellers in discovery who sort millions of documents into “relevant” and “not relevant” piles are toast. The lawyers who figure out what else to ask for, what to say at depositions based on what the other side produces, how to talk settlement, how to get ready for trial – they’ll continue to be in demand as long as there’s such a thing as litigation.

Yet my experience working with law firms and training lawyers tells me that most firms could be doing a better job of getting the best out of their software, new and old. The better you manage AI, the more in demand you’ll be.

If you operate any kind of AI program, consider these issues.

1. Before you ask your computer tough questions, see how good it is at answering easy ones.

Even the “simple” AI of yesteryear makes colossal mistakes and omits things that a sharp human needs to catch.

A recent study at the University of Colorado of current, simple databases dealing in a small universe of data – U.S. case law – sought to gauge the extent of human bias in the different algorithms written for each database. After asking each program for a list of the ten leading cases for the concept “the right to receive information,” just seven percent of the 3,000 cases reviewed showed up in all six databases. The computers missed a lot.

Google is AI, but it has stored very little of the information you know about yourself. Google yourself and see. If you’re leaning heavily on Google to reach your conclusions, you’re missing a ton of information. Westlaw and LexisNexis are bursting with stale information (houses you haven’t lived in for a decade), not to mention information that’s just plain wrong. They will mix up John L. Austin with John B. Austin, even though the middle initials are different. And that’s just the “simple” programming we use now.

If Google’s no good at telling your story, why should it be any better at researching that opposing party?

2. Are you getting lazy and asking the computer to answer non-binary questions?

Computers think in a binary way (1 or 0, yes or no). They can’t guess, speculate, or imagine, all of which are critical parts of human thought. They aren’t set up to answer questions such as:

  • “How well-regarded in his profession is this person?”
  • “Does this person have substantial assets?”
  • “Who would help impeach this witness’s credibility?”

You need to be able to take the binary answers and use your imagination, empathy, and other human traits to turn data into usable information for the law firm.

3. Are you depending on software vendors alone to train you in AI?

That’s like allowing the car dealer to do the test drive for you. Vendors come in with planned searches that make their products look good. You need to use a product for something you need, not something the vendor presumes you need. Use a case you know well, a client, or your own law firm as a control experiment. You may buy the software in the end, but you’ll have a very good idea of its blind spots.

Training is critical, because what lawyers will be asked to do is changing. Computing power may be increasing, but as it does, so is the amount of information available. We’ll still need people to analyze the suggestions computers hand us based on transcribed podcasts, YouTube transcripts, and much more. But being good at practicing law doesn’t necessarily mean a person will be good at investigation in an unlimited universe of facts. Law schools in the U.S. are nearly uniform in their omission of fact investigation as a subject in their curricula.

Management professor Ed Hess argues that, in the age of AI, smart machines can process, store, and recall information faster than any person, so the skills of memorizing and recall (widely rewarded on exams) are not as important as they once were.

If you’re a lawyer and you come to a conclusion from a computer printout without being able to substantiate it with public records or convincing interviews, then you are that customer service rep who says, “That’s what the computer is telling me.” Get ready to find somewhere else to work.


Philip Segal is a New York lawyer and Managing Member of Charles Griffin Intelligence LLC. His firm performs fact-finding for lawyers in litigation, M&A and high-value asset identification. Segal lectures to bar associations across the country on investigative techniques and ethics, and provides in-house training on research techniques and computer-assisted investigation. He is the author of The Art of Fact Investigation (Ignaz Press, 2016) and of a forthcoming article in the Savannah Law Review, Legal Jobs in the Age of Artificial Intelligence: Moving from Today’s Limited Universe of Data Toward the Great Beyond, on which this article is based. The article is available in draft form here.

April 11, 2017

Divorce and Your Money Show

Philip Segal was interviewed at length about asset searching by Shawn Leamon on the Divorce and Your Money Show. Here.


March 22, 2017

Senate confirmation hearings for Neil Gorsuch

Philip Segal was interviewed on Fox News about the Senate confirmation hearings for Neil Gorsuch to be a Justice of the U.S. Supreme Court here.


October 13, 2016

Donald Trump libel suit against the New York Times

Philip Segal was interviewed on Fox News about a possible Donald Trump libel suit against the New York Times here.

September 28, 2016

Review: The Art of Fact Investigation

The Art of Fact Investigation by Philip Segal was reviewed in the magazine of the Foreign Correspondents' Club of Hong Kong, here.

August 14, 2016

CBS News: Brendan Dassey "Making a Murderer"

Philip Segal was interviewed on CBS News about the suppression of the confession of Brendan Dassey, the subject of the Netflix series "Making a Murderer." Video here.


July 04, 2016

Fox News: TSA airport checkpoint treatment of disabled girl

Philip Segal was interviewed on Fox News about the disabled girl who was bloodied at a TSA airport checkpoint. Video here.