If there were a court case on whether society should embrace artificial intelligence (AI) or reject it, it would likely end in a hung jury.

People can’t agree on whether AI’s benefits, such as automating tasks and sifting through large amounts of information quickly, outweigh its problems with biased data, inaccuracy, and a lack of accountability.

AI could be a mixed bag for the legal profession. A 2021 report from the UK’s Law Society predicted it could mean fewer jobs for people, and a recent study from universities in Pennsylvania, New York, and Princeton suggested that the legal field is the industry most likely to be affected by AI.

At the same time, AI can play a hugely valuable role in researching and putting cases together, although there is precedent for things going horribly wrong.

New York lawyer Steven Schwartz found himself facing his own court hearing this year after he used the popular AI system ChatGPT to research precedents for a case involving a man suing an airline over a personal injury. Six of the seven cases he cited had been completely made up by the AI.

Many law firms might be hesitant to adopt these systems, but Ben Allgrove, who is in charge of innovation at the law firm Baker McKenzie, sees it differently. He believes it’s not just about the technology, but about the behavior and values of lawyers. He thinks it’s important to address the issues of professionalism and ethics before focusing on whether Mr. Schwartz should have used the tool in the first place.

AI is already being used by some law firms

Since 2017, Baker McKenzie has been following and studying advances in AI. They’ve even formed a team made up of lawyers, data scientists, and data engineers to try out the new systems that are being introduced to the market.

Mr. Allgrove believes that most of the AI used in his company will come from upgraded versions of existing legal software, such as LexisNexis and Microsoft’s 365 Solution for Legal, which now have AI capabilities.

In May, LexisNexis introduced its AI platform, which can answer legal questions, draft documents, and summarize legal matters. Meanwhile, Microsoft’s AI tool, Copilot, will be available to business customers next month as an extra feature for 365 users, at an additional cost.

“We currently use LexisNexis and Microsoft, and they will soon have enhanced features thanks to generative AI. If these additions are practical and reasonably priced, we’ll consider getting them,” says Mr. Allgrove.

Generative AI is the type of AI currently getting the most attention. It can produce text, images, and music based on the data it was trained on.

However, the downside is that, right now, the advanced, paid versions of these tools can be quite costly. Adopting Microsoft’s Copilot alone, for example, would double the firm’s technology spending, according to Mr. Allgrove.

Another option for law firms is to pay less for access to AI systems not specifically designed for the legal field, like Google’s Bard, Meta’s Llama, and OpenAI’s ChatGPT. The firms would adapt these platforms for their own legal needs.

Baker McKenzie is currently testing several of these options to see how well they perform. This testing is vital, Mr. Allgrove explained, because all of these systems will make mistakes.

One legal software firm, RobinAI, uses what it calls an AI co-pilot to speed up the drafting and checking of contracts. It is used both by in-house legal teams at large organizations and by individuals.

RobinAI mainly uses an AI system made by Anthropic, a company started by a former OpenAI executive and backed by Google.

However, RobinAI has also built its own AI models, which are trained on the details of contract law. Every contract used in the system is uploaded, labelled, and used as a learning tool.

This means the firm has gathered a huge database of contracts. Karolina Lukoszova, who helps lead the legal and product side at RobinAI in the UK, believes this will be really important for using AI in the legal field.

Karolina Lukoszova’s firm, RobinAI, uses both bought-in AI and its own

Companies will have to train their own smaller AI models on their own data. This will give them more accurate results that are kept private.

To ensure accuracy, RobinAI has a team of human lawyers working with the AI.

Alex Monaco is a lawyer who manages both his own legal practice and a tech company called Grapple.

Grapple was created to give the public what Mr. Monaco calls a “guide to employment law”. It offers advice on various workplace issues like bullying, harassment, and redundancy. It can create legal letters and provide summaries of cases.

He’s enthusiastic about how AI can make the legal profession more accessible.

“About 95% of the inquiries we receive are from people who can’t afford lawyers,” says Mr. Monaco.

But now, with free AI tools available, people can build their own legal cases. Anyone with internet access can use Bard or ChatGPT to help write a legal letter. While it might not be as good as a letter from a lawyer, it’s still free.

“AI isn’t taking over for humans, or lawyers. Instead, it’s boosting people’s understanding and use of their legal rights,” he says.

He adds that in a world where everyone is using AI, this could be very important.

“Companies and corporations are using AI for hiring and firing. They are profiling CVs, using AI for restructuring, mass redundancies and so on. They’re using this against the average employee.”

While the use of AI in law is very much still at an early stage, some systems are already facing their own legal challenges.

DoNotPay, which dubs itself the world’s first robot lawyer, offering to fight parking fines and other citizen cases using AI, has been hit with a range of lawsuits, the latest of which accuses the firm of practising law without a licence.

Meanwhile, as a result of Steven Schwartz’s case, several senior judges in the US now require lawyers to disclose whether AI was used for court filings. Mr. Monaco thinks this will be both difficult to define and police.

“Google uses AI within its search algorithm, and now it’s using Bard. So even by googling anything, you are already using AI to do your legal research.”

Source: BBC
