Really? Even with all the data showing how biased human judges are?
At my day job, I automate everything because humans can't reliably perform basic tasks.
I'm not a believer in web3 or crypto but believe computers to be more impartial and would rather see them eventually take over certain aspects of legal work.
Automation is valuable in itself. Imagine a system of thousands of smart contracts interacting with each other millions of times a second. No court system is (currently) equipped to deal with that except in the most superficial way.
There are major $$$, legal, and security ramifications for clients in many cases. Having an AI that can't properly deal with ambiguity and hallucinates an outright reckless idea 1% of the time is completely unacceptable.
Writing code, sure. A human ultimately reviews it. I suspect in the legal world a lot of legal writing can also be automated to some degree. But strategic decisions, designs, etc. very much need a human pulling the trigger.
That the legal system has flaws isn't a good argument for allowing those flaws to become automated. If we're going to automate a task, we should expect it to be better, not worse or just as bad (at this stage it would definitely be worse).
I work in legal tech (AI). No, it's not automating away work; much more work is being done to increase the ability to do _interesting_ and otherwise challenging work (say, more efficient research loops, so a lawyer can uncover more legal connections between documents).
Higher-end jobs have not been automated to any significant degree yet. Accounting, something you'd assume is easy to automate with software, is still growing; there are more accountants than ever. The only automation done in the legal industry is word processing software and legal search engines--which actually still really suck compared to Google.
All these claims about the professional class being put out of work are based on a superficial view of the industries. Lawyers don't exist to spit out the law for you.
Any AI that can lawyer for you is advanced enough to write software itself.
We still haven't even automated repetitive jobs yet. Maybe these people's kids or grand kids will have to worry about it, but not them.
Human judges verifying the data for outcomes seems akin to proofreading. As computers get more advanced, less "proofreading" is needed. It seems like automating lawyer work would make society a fairer place, if you agree with me that more cases being heard is better than people being unable to sue or settle. Prices could also go down this way.
I'm very curious whether there'd be a different take on common law versus civil law when it comes to AI. There's already an AI called NDA Lynn which checks whether an NDA is "fair" for you or not. It's essentially free if you're OK with your data becoming part of the ML base (if not, it costs 45 EUR IIRC). [1] The person who made this is a lawyer and isn't afraid this type of tech will put lawyers out of business.
I'm currently reading Neal Stephenson's The Diamond Age, and it was apparent to me that Bud's hearing was all done by humans, and a judge. It seemed to me that the verdict came so quickly simply because it was crystal clear he was guilty. Or did I miss something?
As a technologist, I feel there are problems we can solve using computers, and there are problems we cannot.
I am a big fan of smarter computation, but when it comes to legal judgement, I defer to a human. We have all heard of bizarre rulings before (I don't have to remind everyone of the Stanford rape scandal last year), but the human involvement in making a judgement call is what makes the justice system precious.
I am a big fan of kicking Donald Trump out of office, and I would describe this secret algorithm as Donald Trump. Some data were collected; we're not sure how much, how authentic, or how much bias was introduced. We just know that some answer is produced. The algorithm might be as simple as tossing a coin. If I can't trust the leader of our government, how can I trust a machine, secret or not, to make judgement calls, when humans themselves are prone to making poor and irrational ones?
So why humans, if humans are prone to making mistakes and unfair judgements? Because there should be humanity in justice. Yes, Lady Justice is blindfolded, but that doesn't mean we can't show compassion or anger.
Is there a real correlation between crime rates and the number of years in prison? I've heard many say criminals are likely to commit crimes again either because they have no other skills to depend on, or because they have a mental illness that prevents them from obeying laws. So if the dataset says there's a 90% recurrence rate, are we going to sentence people longer? Then why not lock the person up forever, or go for an immediate execution, if all we want is peace?
You see, the purpose we want a jail sentence to serve is correction. This isn't just idealistic talk: many convicts do turn out right and fine if they are given the chance to redeem themselves. We shouldn't have to beg for a safe prison with staff ready to help; those should be requirements of a jail.
I can't help but be reminded of Futurama, which has robot judges (one of the cops is also a robot). We should fear people trying to robotize our humanity. If judging can depend on data, then raising a kid from infancy to adulthood could be done with algorithms too. We'd just need lots of data and lots of simulations.
I think the machines should do the scut work to free up humans to do the really important human stuff, like parenting, teaching, and, yeah, lawyering and judging too.
Another angle: if there is some legal task that's simple enough for machines to do reliably then (almost by definition) that task will turn out to be pointless bureaucratic busywork, eh?
Last but not least, who gets to keep lawyer-bot's pay?
- - - -
edit to add a link to James Mickens' USENIX Security Keynote address: "Why Do Keynote Speakers Keep Suggesting That Improving Security Is Possible?"
Well said. I am not a judge, nor do I plan to pore over millions of criminal arrests to make judgment calls (which would negate the purpose of automation)... All we can work with is the data given.
At some point computers will be able to provide better, cheaper, and faster legal advice than humans. No human can fit all of the law in their head, and humans don't always offer 100% accurate advice. And not everyone can afford a lawyer.
I'll trust expert systems when I have access to the documented source code of the decision-making process after the fact. With modern AI, that's practically impossible, because machine learning replaced manual algorithms in this space ages ago.
Even without AI, companies like these don't like to hand out their source code, so you wouldn't be able to trust them anyway. Programmers are not lawyers, no matter how hard we pretend to be sometimes, and unless every programmer on this project were also a judge, I wouldn't trust it to advise or make any judgement. You're not going to find judges who happen to run a startup as a side gig anyway, so let's just bin the idea.
Automated systems are great at reducing the human factor in a lot of things. For stupid factory work where human minds can be put to much more useful tasks, that's great. For the legal system, no thank you.
Maybe we're going to develop a specialised court system for AI where humans can sue for AI-related injustice. I don't trust companies to self regulate effectively.
I'm not interested in code as much as other forms of human expression. Imagine having to convince the courts that you said something first no matter what the expert AI states.
I still see it not as an AI failure but as a human failure in the use of sophisticated tools.
Yeah, it's scary how much we're allowing courts to simply rubber-stamp "computer says you're wrong." If software is going to act as a witness against me in a civil or criminal case, I would want to be able to at least cross-examine that software. Who knows how crappy and non-functional this automatic, blockchain, AI-based time tracking software is?