AI Archives - Legal Cheek
https://www.legalcheek.com/tag/ai/
Legal news, insider insight and careers advice

Warfare technology: can the law really referee?
https://www.legalcheek.com/lc-journal-posts/warfare-technology-can-the-law-really-referee/
Tue, 02 Jul 2024 07:45:20 +0000

Harriet Hunter, law student at the University of Central Lancashire, explores the implications of AI in the development of weaponry and its effect on armed conflict in international humanitarian law


Artificial Intelligence (AI) is arguably the most rapidly emerging form of technology in modern society. Almost every sector and societal process has been or will be influenced by artificially intelligent technologies, and the military is no exception. AI has firmly earned its place as one of the most sought-after technologies available for countries to utilise in armed conflict, with many pushing to test the limits of autonomous weapons. The mainstream media has circulated many news articles on ‘killer robots’ and the potential risks to humanity — however, the reality of the impact of AI on the use of military-grade weaponry is not so transparent.

International humanitarian law (IHL) has been watching from the sidelines since the use of antipersonnel autonomous mines back in the 1940s, closely monitoring each country’s advances in technology and responding to the aftereffects of usage.

IHL exists to protect civilians not involved directly in conflict, and to restrict and control aspects of warfare. However, autonomous weapons systems are developing faster than the law  — and many legal critics are concerned that humanity might suffer at the hands of a few. But, in a politically bound marketplace, is there any place for such laws, and if they were to be implemented, what would they look like, and who would be held accountable?

Autonomous weapons and AI – a killer combination?

Autonomous weapons have been at the forefront of military technology since the 1900s, playing a large part in major conflicts such as the Gulf War. Most notably, the first use of autonomous weapons was in the form of anti-personnel autonomous mines. Anti-personnel autonomous mines are set off by sensors, with no operator involvement in who is killed, inevitably causing significant loss of civilian life. This led to anti-personnel autonomous mines being banned under the Ottawa Treaty 1997. However, autonomous weapon usage had only just begun.

In the 1970s, autonomous submarines were developed and used by the US Navy, a technology which was subsequently sold to multiple other technologically advanced countries. Since the deployment of more advanced AI, the level of weapons that countries have been able to develop has led to a new term being coined: ‘LAWS’. Lethal Autonomous Weapons Systems (LAWS) are weapons which use advanced AI technologies to identify targets and deploy with little to no human involvement.

LAWS are, in academic research, split into three ‘levels of autonomy’, each characterised by the amount of operator involvement required in their deployment. The first level is ‘supervised autonomous weapons’, otherwise known as ‘human on the loop’ — these weapons allow human intervention to terminate engagement. The second level is ‘semi-autonomous weapons’ or ‘human in the loop’, weapons that, once engaged, will attack pre-set targets. The third level is ‘fully autonomous weapons’ or ‘human out of the loop’, where the weapons system operates with no operator involvement whatsoever.

LAWS rely on advances in AI to become more accurate. Currently, there are multiple LAWS either in use or in development, including:

  • The Uran-9 tank, developed by Russia, which can identify targets and deploy without any operator involvement.
  • The Taranis unmanned combat air vehicle, being developed in the UK by BAE Systems — an unmanned jet which uses AI programmes to attack and destroy large areas of land with very minimal programming.

The deployment of AI within the military has been far-reaching. However, like these autonomous weapons, artificial intelligence is increasingly complex, and its application within military technologies is no different. Certain aspects of AI have been utilised more than others. For example, facial recognition can be used on a large scale to identify targets within a crowd. Alongside that, certain weapons have technology that can calculate the chances of hitting a target, and of hitting it a second time, by tracking its movements — a capability that has been used in drones especially, to track targets as they move from building to building.

International humanitarian law — the silent bystander?

IHL is the body of law which applies during an armed conflict. It has a high extra-territorial extent and aims to protect those not directly involved in conflict, as well as to restrict warfare and military tactics. IHL has four basic tenets: ensuring the distinction between civilian and military targets; proportionality (ensuring that military gain is balanced against the cost to civilian life); ensuring precautions in attack are followed; and the principle of ‘humanity’. IHL closely monitors the progress of the weapons that countries are beginning to use and develop, and is (in theory) considering how the use of these weapons fits within its principles. However, the law surrounding LAWS is currently vague. With the rise of LAWS, IHL is having to adapt and tighten restrictions surrounding certain systems.

One of IHL’s main concerns surrounds the rule of distinction. It has been argued that weapons which are semi- or fully autonomous (human in the loop and human out of the loop systems) are unable to distinguish between civilian and military bodies. This would mean that innocent lives could be taken through the mistake of an autonomous system. As mentioned previously, autonomous weapons are not a new concept, and subsequent to the use of anti-personnel autonomous mines in the 1900s, they were restricted because there was no distinction between civilians ‘stepping onto the mines’ and military personnel ‘stepping onto the mines’. IHL used the rule of distinction to propose a ban, which was signed by 128 nations in the Ottawa Treaty 1997.

The Martens Clause, a clause of the Geneva Conventions, aims to control the ‘anything not explicitly regulated is unregulated’ concept. IHL is required to control, and to a certain extent pre-empt, the development of weapons which directly violate certain aspects of the law. An example of this would be the banning of ‘laser blinding’ autonomous weapons in 1990 — ‘laser blinding’ being seen as a form of torture, which directly violates a protected human right: the right not to be tortured. At the time, ‘laser blinding’ weapons were not in use in armed conflict; however, issues surrounding the ethical implications of these weapons for prisoners of war were a concern to IHL.

But is there a fair, legal solution?

Unfortunately, the chances are slim. More economically developed countries can purchase and navigate the political waters of the lethal autonomous weapons systems market — whilst less economically developed countries are unable to purchase these technologies.

An international ban on all LAWS has been called for, with legal critics stating that IHL cannot fulfil its aims to the highest standard while allowing the existence, development and usage of LAWS. It is argued that the main issue which intertwines AI, LAWS and IHL is the question: should machines be trusted to make life or death decisions?

Even with advanced facial recognition technology, critics are calling for a ban: no technology is without its flaws, so how can we assume that systems such as facial recognition are fully accurate? The use of fully autonomous (human out of the loop) weapons, where a human cannot at any point override the technology, means that civilians are at risk. It is argued that this completely breaches the principles of IHL.

Some legal scholars have argued that the usage of LAWS should be governed by social policy — a ‘pre-emptive governing’ of countries who use LAWS. This proposed system allows and assists IHL in regulating weapons at the development stage, which, it is argued, is ‘critical’ to avoiding a ‘fallout of LAWS’ and preventing a humanitarian crisis. This policy would hold developers to account prior to any warfare. However, it could be argued that this falls outside the jurisdiction of IHL, which applies only once conflict has begun — leading to the larger debate of what the jurisdiction of IHL is, compared with what it should be.

Perhaps IHL is prolonging the implementation of potentially life-saving laws because powerful countries are asserting their influence in decision-making; these powerful countries have the influence to block changes in international law where the ‘best interests’ of humanity do not align with their own military advances.

Such countries, like the UK, are taking a ‘pro-innovation’ approach to AI in weaponry, meaning they are generally opposed to restrictions which could halt progress in the making. However, it has rightly been noted that these ‘advanced technologies’ in the hands of terrorist organisations (who would not be bound to follow IHL) would have disastrous consequences. On this basis, it is argued that a complete ban on LAWS could lead to more violence than no ban at all.

Ultimately…

AI is advancing, and with it, autonomous weapons systems are too. Weapons are becoming more advantageous to the military, with technology becoming more accurate and more precise. International humanitarian law, continually influenced by political stances and the economic benefit to countries, is slowly attempting to build and structure horizontal legislation. However, the pace at which law and technology are developing is not comparable, and this concerns many legal critics. The question remains: is the law attempting to slow an inevitable victory?

Harriet Hunter is a first year LLB (Hons) student at the University of Central Lancashire, with a keen interest in criminal law and the laws surrounding technology, particularly AI.

Lawyers: it pays to be AI savvy
https://www.legalcheek.com/2024/05/lawyers-it-pays-to-be-ai-savvy/
Wed, 29 May 2024 07:43:22 +0000

Tech skills linked to higher salaries for legal professionals


Lawyers with AI skills can expect to tag a premium onto their salaries, new research suggests.

The report, published by Big Four accountancy firm PwC, found that UK lawyer salaries attract a 27% premium where AI skills are required.

This figure was even higher in the US, where lawyers can expect an extra 49% top-up for being tech savvy.

The use of AI within the legal industry is already widespread. Macfarlanes and A&O Shearman have both taken on AI bot ‘Harvey’ to aid lawyers with reviewing, analysing and summarising documents, whilst a second bot, ‘Lawrence’, passed a mock SQE exam.

Another report published earlier this year by LexisNexis found that over a quarter of lawyers regularly use AI in their practice.

Lord Justice Birss has also been vocal in his support for AI in law, noting that ChatGPT is ‘jolly useful’ and that AI ‘can be a force for good’.

Not to be beaten, however, law students are also making the most of AI’s legal offerings. A flash poll of 1,303 students by Legal Cheek found that one in five had used AI to help with training contract and pupillage applications, with Shoosmiths going as far as to offer advice to candidates who wish to use tech to aid their applications.

Klarna encourages in-house lawyers to use ChatGPT for contract drafting
https://www.legalcheek.com/2024/05/klarna-encourages-its-in-house-lawyers-to-use-chatgpt-for-contract-drafting/
Tue, 28 May 2024 11:24:02 +0000

Bad news for external law firms?


The Swedish fintech company Klarna is encouraging its in-house lawyers to use ChatGPT to save time on drafting contracts.

The company, which provides payment processing services for online businesses, is using an advanced version of the AI tool, ChatGPT Enterprise, to write first drafts of common types of contracts.

Klarna says the tool now “massively” reduces the time it takes its lawyers to draw up contracts, with Selma Bogren, the company’s senior managing legal counsel, commenting:

“The big law firms have had a really great business just from providing templates for common types of contract. But ChatGPT is even better than a template because you can create something quite bespoke.”

Bogren went on to add that “instead of spending an hour starting a contract from scratch or working from a template,” she “can tweak a ChatGPT draft in about ten minutes.”

“You still need to adapt it to make it work for your particular case but instead of an hour you can draft a contract in ten minutes,” the top lawyer said.

Klarna says almost nine out of ten employees (87%) now use generative AI to assist with their daily work, with usage in the legal department sitting at 86%.

But the rise in adoption of AI tools is not without its problems. Last summer, two lawyers in the US were fined by a judge for using ChatGPT to undertake legal research, leading to non-existent cases being submitted to the court.

Klarna’s strong support for ChatGPT comes after a study found that more than half of lawyers (51%) believe AI should help with legal work. Twenty-four percent said it should not while a quarter were unsure.

‘AI will do the heavy lifting so lawyers can do the heavy thinking’
https://www.legalcheek.com/lc-careers-posts/ai-will-do-the-heavy-lifting-so-lawyers-can-do-the-heavy-thinking/
Tue, 23 Apr 2024 10:12:45 +0000

Ahead of his appearance at LegalEdCon 2024 next month, LexisNexis’ Matthew Leopold discusses its latest AI offering and how it will likely impact the legal industry

Matthew Leopold, Head of Brand and Insight at LexisNexis UK

“My specialism is to take a brand and challenge people’s assumptions about it,” says Matthew Leopold, Head of Brand and Insight at LexisNexis UK. “My job is to get people to feel more positive about a brand and engage with it in a different way.” Having built a specialism in brand management at tech companies, Leopold notes ironically that one of the biggest challenges of his role at LexisNexis is adapting a brand that almost every lawyer already knows. “Not only is it well known, but LexisNexis is a brand which lawyers understand and trust, and that’s because of the wealth of legal content that underpins the technology.”

Having evolved from a single database created by John Horty in 1956, LexisNexis has moved away from traditional publishing by becoming a global player in legal technology. Leopold is keen to stress the company’s tech-y credentials. “We ultimately create solutions that provide the right legal content at the right time; our technology helps people to find the diamond in the haystack of content,” says Leopold who will be speaking at LegalEdCon 2024 in London on 16 May. On the rewards of his role, he says, “it’s really interesting to be able to create cutting edge legal technology with this underlying, incredibly valuable, exceptionally well-trusted legal content.”

In response to developments in generative AI technology, LexisNexis developed and launched its own AI tool, Lexis+AI, at the end of 2023. With the tool becoming available in the UK in the coming weeks, Legal Cheek Careers was keen to ask Leopold about the key features of this new tech. “At launch, there are four main features that Lexis+AI is going to offer,” he tells us. “First is the conversational search feature. Imagine that you have a really knowledgeable associate sat on the desk next to you, and you can ask them a legal question and get a legal answer in response, pointing you in the direction of all the relevant information.” He continues, “the purpose of the conversational aspect of the search means you can clarify and ask a follow-up question, to which Lexis+AI responds and refines its answers.”

Explaining the benefits of this feature in terms of access to legal research, Leopold explains that, “the sorts of conversations that you would usually have with a human, you can now have with AI — and in this context, it allows you to really mine the depths of the law.” Grounded on the already expansive LexisNexis legal database, Lexis+AI can link you directly to relevant precedents, case law and practice notes within seconds. Leopold explains that this is key to reducing AI ‘hallucinations’ — circumstances where AI models produce nonsensical, falsified information. “We can minimise hallucinations as much as possible,” he says, “however linking directly to the content means that lawyers and students can quickly evaluate AI answers with their own eyes”.
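
For readers curious how this kind of grounding works in general terms, here is a minimal sketch, assuming a toy three-document corpus and a simple TF-IDF retriever. It is a generic illustration of retrieval-grounded prompting, not LexisNexis’s actual implementation: relevant sources are retrieved first, and the prompt handed to the language model is then built around those sources and their titles, so every answer can be traced back to checkable material.

```python
# Generic sketch of retrieval "grounding" (illustrative only, not LexisNexis's
# implementation): fetch the most relevant sources first, then build a prompt
# that asks the model to answer only from those sources and to cite them.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus standing in for a legal content database.
documents = {
    "Case A v B [2001]": "Occupiers owe a duty of care to lawful visitors on their premises.",
    "Practice note: limitation periods": "Claims in simple contract must be brought within six years of the breach.",
    "Case C v D [2015]": "Employers may be vicariously liable for torts committed by employees in the course of employment.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the titles of the k documents most similar to the query."""
    titles = list(documents)
    texts = list(documents.values())
    vectoriser = TfidfVectorizer().fit(texts + [query])
    scores = cosine_similarity(vectoriser.transform([query]),
                               vectoriser.transform(texts))[0]
    ranked = sorted(zip(scores, titles), reverse=True)
    return [title for _, title in ranked[:k]]

def build_grounded_prompt(question: str) -> str:
    """Assemble a prompt that quotes the retrieved sources and demands citations."""
    sources = retrieve(question)
    context = "\n".join(f"[{title}] {documents[title]}" for title in sources)
    return ("Answer the question using ONLY the sources below, citing them by title.\n"
            f"Sources:\n{context}\n\nQuestion: {question}")

print(build_grounded_prompt("How long do I have to bring a contract claim?"))
```

Linking each cited title back to the underlying document is what lets a lawyer or student check an answer against its source, which is the verification step Leopold describes.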

The second key feature of this potentially industry-altering tech is its summarisation capabilities. Leopold notes that, “public access AI tools, such as ChatGPT  are not legally trained. They don’t understand the legal use-case for what it’s doing.” The difference with Lexis+AI is that “rather than producing a summary of a case, it presents a case digest which includes jurisdiction, key material facts, controlling law and more.”

Lexis+AI also boasts drafting capabilities and the ability to upload your own documents for review. “Lexis+AI can help you draft clauses, form arguments, and create letters to clients,” Leopold explains. By integrating the features of the technology, both lawyers and students are able to extract information through conversational search. They can then prompt Lexis+AI to use this information to create legal arguments or letters. “It’s important to emphasise that this will result in a first draft,” Leopold stresses. “We do not proclaim that this is going to be the end result. You would always expect a senior to review the work of a junior before it goes to a client. The same is true with AI-generated content.”

In that vein, we ask Leopold how he envisions the future of the legal industry with the introduction of generative AI, and where the boundaries between lawyers and computers really lie. “AI is the next big frontier,” he says. “There is no avoiding it; it’s a matter of when, not if. There are going to be fundamental changes to the legal market. Take the good old billable hour. It is going to change. In a world where technology can do the heavy lifting of legal research in a couple of seconds, the whole idea of charging by the hour becomes difficult to justify.” He predicts that we’re likely to see an evolution towards value-based pricing in law firms, and more innovative fee structures, as firms transform with the implementation of AI.

“Historically, the legal industry has been a slow adopter of technology,” says Leopold. “This is the first piece of technology that is truly challenging the status quo. Law firms are now considering what this means for their core business and the skills that the lawyers of tomorrow will require. There is a very exciting and busy future ahead for lawyers and the whole legal industry.”

Following the idea that AI is paving the way for some dramatic shifts in the legal industry, we’re keen to hear Leopold’s thoughts on the differences between the role of a lawyer and the role of AI in legal research. “I think that both are the future, and that one can’t really exist without the other,” he says. “We are very clear that Lexis+AI is not created to replace a lawyer. Lawyers need to still be in the loop because they can identify legal context, and other concepts which cannot be trained into an AI model.” Similar issues are raised, Leopold continues, when one considers the human aspect of legal work, requiring negotiation skills, teamwork and often empathy. Ultimately, AI’s ability to reduce manual, administrative legal tasks is huge, leaving lawyers to focus on problem solving, according to Leopold. “AI will do the heavy lifting so that the lawyer can do the heavy thinking.”

Matthew Leopold, Head of Brand and Insight at LexisNexis UK, will be speaking at LegalEdCon 2024, Legal Cheek’s annual future of legal education and training conference, which takes place in-person on Thursday 16 May at Kings Place, London. Final release tickets for the Conference can be purchased here.

Beware of ‘deepfake’ clients, regulator warns lawyers
https://www.legalcheek.com/2024/03/beware-of-deepfake-clients-regulator-warns-lawyers/
Wed, 13 Mar 2024 07:53:16 +0000

Concerns over money laundering and terrorist financing


The Solicitors Regulation Authority (SRA) has issued a new warning about the risk posed by artificial intelligence (AI) to the legal profession in the form of ‘deepfake’ technology.

As part of its regular risk assessments for anti-money laundering and terrorist financing, the SRA has highlighted the potential risks of deepfake technology alongside other emerging and existing issues.

“Not meeting a client face-to-face can increase the risk of identity fraud and without suitable mitigation such as robust identity verification may help facilitate anonymity,” the warning states.

Whilst “not meeting face-to-face may make sense in the context of a given transaction or wider context… where clients appear unnecessarily reluctant or evasive about meeting in person, you should consider whether this is a cause for concern.”

Firms are also told to be aware of the use of AI to create so-called ‘deepfakes’, which can impersonate a real person’s appearance convincingly.

“This increases the risk of relying on video calls to identify and verify your client. If you only meet clients remotely, you should understand whether your electronic due diligence protects you against this, or to explore software solutions to assist in detecting deepfakes,” the SRA adds.

In a speech last week the second most senior judge in England and Wales, Sir Geoffrey Vos, highlighted the continued growth of AI in the legal profession, and its potential for further expansion.

“One may ask rhetorically whether lawyers and others in a range of professional services will be able to show that they have used reasonable skill, care and diligence to protect their clients’ interests if they fail to use available AI programmes that would be better, quicker and cheaper,” Vos said.

Noting also the potential use of tech in judicial decisions, he added:

“I will leave over the question of whether AI is likely to be used for any kind of judicial decision-making. All I would say is that, when automated decision-making is being used in many other fields, it may not be long before parties will be asking why routine decisions cannot be made more quickly, and subject to a right of appeal to a human judge, by a machine. We shall see.”

Last month Shoosmiths became one of the first law firms to offer guidance to students on the use of AI when making training contract and vacation scheme applications.

Shoosmiths advises TC seekers on using AI in applications
https://www.legalcheek.com/2024/02/shoosmiths-advises-tc-seekers-on-ai-usage-in-applications/
Thu, 22 Feb 2024 08:14:07 +0000

Refine original thoughts, not replace them


Shoosmiths has become one of the first law firms to issue guidance to aspiring solicitors on the appropriate use of artificial intelligence (AI) tools in vacation scheme and training contract applications.

In a recent blog post, the firm’s emerging talent advisor, Laura Hartigan, provides guidance on how prospective solicitors can utilise AI tools to enhance their applications. But she also cautions against merely copying responses generated by bots.

The advice comes amidst what Legal Cheek understands is a rise in the number of students misusing AI tools like ChatGPT when completing law firm applications.

Whilst students are welcome to use AI tools in their initial applications, Hartigan warns that she and her recruitment colleagues “don’t condone simply copying and pasting AI-generated responses”.

“Use AI as a tool to refine and develop your own original thoughts, not replace them”, she says. “Aspiring solicitors must remember that integrity and honesty are fundamental attributes that cannot be replaced by technology.”

The areas where tools can be of most use, Hartigan suggests, are in aiding with time management and organisation, proofreading answers, and suggesting amendments to draft questions.

Echoing previous advice given by the Solicitors Regulation Authority (SRA), Hartigan also warns students not to “blindly accept AI-generated content without understanding its sources or implications”.

What’s more, “trying to pass off AI-generated content as solely your work undermines your credibility and demonstrates a lack of respect for the application process”, she says.

The blog goes on to urge students to broaden their understanding of AI and its use in the legal field, recommending they attend events, including those run by Legal Cheek, as a prime resource to help with this.

Over a quarter of lawyers regularly using AI
https://www.legalcheek.com/2024/02/over-a-quarter-of-lawyers-regularly-using-ai/
Mon, 12 Feb 2024 08:22:36 +0000

For research, drafting and comms


More than a quarter of legal professionals are using AI tools regularly, new research has found.

The research, compiled by LexisNexis, found that 26% of legal professionals are now using generative AI tools in their work at least once a month.

Of those surveyed, 91% thought that AI could be used to assist with drafting, 90% saw a use in researching matters, and 73% saw the new tech as a way to make communication more efficient.

The report, which attracted responses from 1,200 lawyers, also notes that 62% of law firms have made changes to their daily operations because of AI. These include running specialist training for staff, hiring AI experts, developing policies for the use and limits of tech, and providing AI products for lawyers to use.

Survey respondents were also asked to share their concerns about the new tools. Worry over “hallucinations” was prevalent, with 57% of those surveyed seeing this as a problem, as were security risks, cited by 55%.

Commenting on the use of AI within the legal profession, partner and chief innovation officer at Baker McKenzie, Ben Allgrove, said:

“It [AI] will change how we practice law. One immediate area of focus is on how we might use it to improve the productivity of our people, both our lawyers and our business professionals. While there are, of course, quality and risk issues that need to be solved, we see opportunities across our business to do that.”

The SRA offered guidance on the use of AI to lawyers at the end of last year, citing both its ability to boost efficiency and reduce costs, and potential risks to privacy and issues of inaccurate information.

Barristers warned against risks of ChatGPT
https://www.legalcheek.com/2024/02/barristers-warned-against-risks-of-chatgpt/
Thu, 01 Feb 2024 08:54:49 +0000

But Bar Council says AI use not ‘inherently improper’


Barristers have been given new guidance by the Bar Council on the use of ChatGPT and other AI systems.

Whilst the guidance states that there is “nothing inherently improper about using reliable AI tools for augmenting legal services”, it emphasises that barristers should exercise caution and carefully consider the numerous risks.

Chief among these potential pitfalls are breaches of confidentiality and privileged information, infringement on IP rights, and information disorder through systems inadvertently generating misinformation.

The Bar Council is also worried about the risks of anthropomorphism, bias and “stereotype reinforcement” on some AI platforms, as well as “hallucinations”. There has already been at least one case in the UK where a litigant in person presented nine legal ‘authorities’, all of which, it transpired, were entirely made up by an AI system such as ChatGPT, the barristers’ body warned.

The “irresponsible” use of AI can lead, the guidance goes on, “to harsh and embarrassing consequences, including claims for professional negligence, breach of contract, breach of confidence, defamation, data protection infringements, infringement of IP rights (including passing off claims), and damage to reputation”. It could also result in breaches of professional rules and duties, leading to disciplinary action and sanctions.

Whilst new software can “complement and augment human processes to improve efficiency” the report adds, it “should not be a substitute for the exercise of professional judgment, quality legal analysis and the expertise which clients, courts and society expect from barristers”.

Sam Townend KC, chair of the Bar Council, said:

“The growth of AI tools in the legal sector is inevitable and, as the guidance explains, the best-placed barristers will be those who make the efforts to understand these systems so that they can be used with control and integrity. Any use of AI must be done carefully to safeguard client confidentiality and maintain trust and confidence, privacy, and compliance with applicable laws.”

He continued: “This Bar Council guidance sets out the key risks and considerations and will support barristers using LLMs to adhere to legal and ethical standards. It will be kept under review and practitioners will need to be vigilant and adapt as the legal and regulatory landscape changes.”

This new guidance marks the latest in a series of documents issued to lawyers, with judges and solicitors offered advice on the use of AI at the end of last year.

Judges encouraged to embrace AI — carefully
https://www.legalcheek.com/2023/12/judges-encouraged-to-embrace-ai-carefully/
Tue, 12 Dec 2023 14:33:18 +0000

What could possibly go wrong?


Judges have received new guidance on the use of AI tools in courts. The report, which was produced for all judicial office holders, noted the potential uses of AI, while focusing on the risks that new tech presents.

For summarising or administrative tasks, the guidance states that judges may find AI tools useful. Sir Geoffrey Vos, Master of the Rolls and the country’s second most senior judge, added that AI provides “great opportunities for the justice system”, and the potential to help develop “a better, quicker and more cost-effective digital justice system”.

“But”, he noted, “because it’s so new we need to make sure that judges at all levels understand [it properly]”. “Technology will only move forwards and the judiciary has to understand what is going on. Judges, like everybody else, need to be acutely aware that AI can give inaccurate responses as well as accurate ones.”

For conducting legal research, the guidance is clear that AI bots are “not recommended”. The information these tools provide, the guidance continues, “may be inaccurate, incomplete, misleading or out of date.” Concern was also raised over AI’s tendency to rely heavily on US case law. “Even if it purports to represent English law,” the document says, “it may not do so.”

Elsewhere in the guidance there was concern over potential privacy risks, with judges instructed that any detail they give to a public AI tool “should be seen as being published to all the world”.

The use of deepfake technology to forge evidence was also referenced. This was highlighted in another recent report by the SRA, stating that this manufacturing of evidence using AI had already been in contention in UK cases.

Last week, a tax tribunal found as fact that nine cases presented by a litigant in person had, unknown to the litigant, been fabricated by “an AI system such as ChatGPT”. Whilst the fabrication was ultimately sniffed out, and had no impact on the case at hand, “providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue”, the tribunal said.

How one engineer is helping lawyers build robots
https://www.legalcheek.com/lc-careers-posts/how-one-engineer-is-helping-lawyers-build-robots/
Tue, 12 Dec 2023 10:16:30 +0000

Pinsent Masons technical lead talks all things AI

“I studied engineering at university and decided I didn’t want to be an engineer – but I liked being a student, so I went back and did a Master’s degree. This was all around the time that the internet showed up on the scene, so I knew I wanted to do something related to technology”, recounts Jason Barnes, low code development technical lead at Pinsent Masons.

Barnes joined the firm before it became Pinsent Masons, originally planning to work for a year or two as he thought up an idea for a PhD. But starting off in a general IT position, he was soon able to get involved in designing databases, often for niche legal work, something he found to be “quite good fun”. Subsequently, web applications came along, and that provided another avenue of interest. “I knew straightaway that this was what interested me, so I effectively became a web application developer. We started building web applications for clients and lawyers, and were met with a good degree of success, so we did more work and our team grew”, says Barnes.

Re-evaluating his career trajectory some years later, Barnes decided to move away from a full-on development role to explore the product management side of things. “In short, this involved looking at how software solutions can be implemented to make things easier for people and businesses”. But missing the creativity of being a developer, Barnes started to get involved with no code/low code tools, such as the Microsoft Power platforms, which came on to the market in a big way a couple of years ago. “I got quite excited with these and was convinced that this was a significant technology direction for us as a firm. Nobody else was spearheading this within Pinsent Masons, so I decided to — now I head up our low code team and am back to being a developer!”

Responding to a question about his day-to-day, Barnes chuckles, saying, “most of the time I have to be stopped — I really do like my job!” He explains that low code tools are designed for non-developers to use and build applications.

“At professional firms, you’ve got, say, a large mass of lawyers who are lawyering and need solutions to help them do this. Now, you can go out to the market to buy these solutions, but for bigger, innovative firms, you want to do this yourself, so you can build exactly what you want. Now, a law firm will only have a certain number of developers, and even they can only do so much when everyone at the firm has an idea they want to see developed. Low code tools can step in and help those people with the ideas to do the development themselves, without having to wait for the developers to do it. So essentially, we’ve got lawyers building robots, although they might not always realise that that’s what they’re doing”.

Barnes sees this as a form of empowerment — with low code tools allowing lawyers to take charge of automating processes and eliminating the frustration of having to wait around for developers to take charge. He does point out, however, that while one can achieve quite a lot with these tools, there’s still some elements that are difficult to navigate, which is where his job steps in, as technical lead of low code development. “We’re there to help the people who have the ideas turn their ideas into solutions”, he summarises.

What’s the typical process through which AI is developed at a law firm like Pinsent Masons? “Despite having worked in technology my whole life, I still always start things off with a pen and paper. If you can’t draw what you want to build, then you’re not going to be able to build it”, he responds. Barnes also notes that while lawyers are usually great at articulating what it is they want to build, representing this in a diagrammatic form is often challenging. “However, this is at the core of the developer mindset, so we can help with that”, he explains.

Barnes also speaks about the main challenges posed by artificial intelligence (AI) in the legal industry, pointing out that “very few people have a clear understanding of what we mean when we talk about AI”. “With the vast majority of people building their views on what they see in the media, most exposure is to generative AI, such as ChatGPT — but that’s only one part of what AI actually is. When I talked earlier about a lawyer having an idea to automate a process, that’s also AI. It’s a computer system doing what a human would normally do. So, one of the challenges is really understanding what it is we’re talking about in the first place”, he explains.

“One of the things at the forefront of everyone’s mind is the protection of client data”, Barnes continues, on the topic of challenges associated with AI in law. “As law firms, nothing matters more than the integrity of our clients’ data — everybody is conscious of the risk of having large language models trained on data sets comprised of client data without prior client agreement”, he notes. On the flipside, Barnes notes that the greatest opportunity for AI in the legal industry is in reimagining the everyday and taking the monotonous tasks off lawyers’ hands, so that they are freed up to tap into their human intelligence to provide better legal services for clients. He offers up document extraction as a tangible example of where AI can have application.

Approaching the end of our conversation, Barnes offers his views on the ‘are lawyers going to be replaced by robots’ debate. “Part of me thinks, well yeah”, he laughs. “A lot of the work I do is around innovation, driving up quality and lowering the cost base. So, taking that to its logical conclusion, we could be looking at a world where we do things artificially across all industries and save a lot of money. But I don’t think anyone wants that”, observes Barnes. While quantifying things in terms of processes and diagrams is easy, and might foretell an automated future, he notes that this ignores the human element of interpersonal relationships which is crucial in the legal space. “I don’t think legal work can be reduced to a collection of ones and zeros”, he concludes.

Solicitors make government list of jobs ‘most exposed’ to AI
https://www.legalcheek.com/2023/12/solicitors-make-government-list-of-jobs-most-exposed-to-ai/
Fri, 01 Dec 2023 10:29:37 +0000

Management consultants top table; sports players and roofers least exposed


Solicitors may wish to consider a dynamic career change to roofing, plastering, or window cleaning according to new predictions on the impact of artificial intelligence (AI).

The latest report, published by the Department for Education, lists solicitors as the 12th most exposed occupation to the impacts of AI. Other “legal professionals” came in higher, taking 9th position.

Topping the table are management consultants and business analysts, financial managers, accountants, and psychologists.

For those now panicking and looking to jump ship, fear not, the report also ranks the occupations least likely to be impacted. Claiming pole position here are sports players, with roofers and elementary construction occupations taking the second and third spots.

Also on this list are plasterers, cleaners, floorers, launderers, and window cleaners.

Analysing the data, the report goes on to note how: “The occupations least exposed to AI and LLM include many of the same areas, including more manual work that is technically difficult, in unpredictable environments, and with lower wages (reducing the incentive to automate) — with the exception of sports players.”

However, it may not be time to jump into an AI-proof lifeboat just yet. “The exposure score is based on several assumptions including the abilities considered important for a job at a given point in time so rankings should be interpreted with caution, however the themes highlighted by the analysis are expected to continue”.

Could you be fired by a robot – and would UK anti-discrimination law protect you?
https://www.legalcheek.com/lc-journal-posts/could-you-be-fired-by-a-robot-and-would-uk-anti-discrimination-law-protect-you/
Thu, 30 Nov 2023 07:49:14 +0000

Puja Patel, University of Cambridge law graduate, offers an analysis into whether the UK’s current anti-discrimination laws are fit for purpose in the wake of AI


Imagine if popular BBC TV series, The Apprentice, had a robot instead of Lord Sugar sitting in the boardroom, pointing the finger and saying ‘you’re fired.’ Seems ridiculous, doesn’t it?

Whilst robots may not be the ones to point the finger, more and more important workplace decisions are being made by artificial intelligence (‘AI’) in a process called algorithmic decision-making (‘ADM’). Indeed, 68% of large UK companies had adopted at least one form of AI by January 2022 and as of April 2023, 92% of UK employers aim to increase their use of AI in HR within the next 12-18 months.

Put simply, ADM works as follows: the AI system is fed vast amounts of data (‘training data’), upon which it models its perception of the world by drawing correlations between the data and outcomes. These correlations then inform the decisions made by the algorithm.

At first glance, this seems like the antithesis of prejudice. Surely a ‘neutral’ algorithm which relies only upon data would not discriminate against individuals?

Sadly, it would. Like an avid football fan who notices that England only scores when they are in the bathroom and subsequently selflessly spends every match on the toilet, ADM frequently conflates correlation with causation. Whilst a human being would recognise that criteria such as your favourite colour or your race are discriminatory and irrelevant to the question of recruitment, an algorithm would not. Therefore, whilst algorithms do not directly discriminate in the same way that a prejudiced human would, they frequently perpetrate indirect discrimination.
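
To make this concrete, the following is a minimal sketch using synthetic data and hypothetical feature names (it is not based on any real employer’s system). It shows how a model trained on biased historical outcomes can attach a negative weight to a proxy feature that correlates with a protected characteristic, even though the protected characteristic itself is never given to the model.

```python
# Illustrative sketch with synthetic data: the model is never told candidates'
# sex, but learns to penalise a proxy feature that correlates with it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

sex = rng.integers(0, 2, n)                           # 0 = male, 1 = female (hidden from the model)
experience = rng.normal(5, 2, n)                      # a genuinely job-relevant feature
womens_society = (sex == 1) & (rng.random(n) < 0.6)   # proxy: correlates with sex, not ability

# Biased historical labels: past hiring decisions favoured male candidates.
hired = (experience - 1.5 * sex + rng.normal(0, 1, n)) > 4

X = np.column_stack([experience, womens_society.astype(float)])
model = LogisticRegression().fit(X, hired)

# The proxy feature receives a negative coefficient, so candidates who list the
# society are scored down -- indirect discrimination learned purely from data.
print(dict(zip(["experience", "womens_society"], model.coef_[0].round(2))))
```

A human reviewer might recognise that membership of such a society is irrelevant to the job; the algorithm only sees that it correlates with past rejections.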

Unfortunately, this has already occurred in real life — both Amazon and Uber have famously faced backlash for their allegedly indirectly discriminatory algorithms. According to a Reuters report, members of Amazon’s team disclosed that Amazon’s recruitment algorithm (which has since been removed from Amazon’s recruitment processes) taught itself that male candidates were preferable. The algorithm’s training data, according to the Reuters report, comprised resumes submitted to Amazon over a 10-year period, most of which came from men; accordingly, the algorithm drew a correlation between male CVs and successful candidates and so filtered CVs containing the word ‘women’ out of the recruitment process. The Reuters report states that Amazon did not respond to these claims, other than to say that the tool ‘was never used by Amazon recruiters to evaluate candidates’, although Amazon did not deny that recruiters looked at the algorithm’s recommendations.

Similarly, Uber’s use of Microsoft’s facial recognition algorithm to ID drivers allegedly failed to recognise approximately 20% of darker-skinned female faces and 5% of darker-skinned male faces, according to IWGB union research, resulting in the alleged deactivation of these drivers’ accounts and the beginning of a lawsuit which will unfold in UK courts over the months to come. Microsoft declined to comment on ongoing legal proceedings whilst Uber says that their algorithm is subject to ‘robust human review’.

Would UK anti-discrimination law protect you?

Section 19 of the Equality Act (‘EA’) 2010  governs indirect discrimination law. In simple terms, s.19 EA means that it is illegal for workplaces to implement universal policies which seem neutral but in reality disadvantage a certain protected group.

For example, if a workplace wanted to ban employees from wearing headgear, this would disadvantage Muslim, Jewish and Sikh employees, even though the ban applied to everyone – this would therefore be indirectly discriminatory, and unless the workplace could prove this was a proportionate means of achieving a legitimate aim, they would be in breach of s.19 EA.

But here’s the catch. The EA only applies to claimants from a ‘protected group’, which is an exhaustive list set out at s.4 EA: age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex and sexual orientation.

The Amazon and Uber claimants fall into the protected categories of ‘sex’ and ‘race’ respectively. Therefore, the EA will protect them – in theory. In reality, it is very difficult to succeed in a claim against AI, as the claimants are required by the EA to causally connect the criteria applied by the algorithm with the subsequent disadvantage (e.g. being fired). It is often impossible for claimants to ascertain the exact criteria applied by the algorithm; even in the unlikely event that the employer assists, the employer themselves is rarely able to access this information. Indeed, the many correlations algorithms draw between vast data sets mean that an algorithm’s inner workings are akin to an ‘artificial neural network’. Therefore, even protected group claimants will struggle to access the EA’s protection in the context of ADM.

Claimants who are discriminated against for the possession of intersectional protected characteristics (e.g. for being an Indian woman) are not protected as claimants must prove that the discrimination occurred due to one protected characteristic alone (e.g. solely due to either being Indian or a woman). ‘Intersectional groups’ are therefore insufficiently protected despite being doubly at risk of discrimination.

And what about the people who are randomly and opaquely grouped together by the algorithm? If the algorithm draws a correlation between blonde employees and high performance scores, and subsequently recommends that non-blonde employees are not promoted, how are these non-blonde claimants to be protected? ‘Hair colour’ is not a protected characteristic listed in s.4 EA.

And perhaps most worryingly of all — what about those individuals who do not know they have been discriminated against by targeted advertising? If a company uses AI for online advertising of a STEM job, the algorithm is more likely to show the advert to men than women. A key problem arises — women cannot know about an advert they have never seen. Even if they find out, they are highly unlikely to collect enough data to prove group disadvantage, as required by s.19 EA.

So, ultimately – no, the EA is unlikely to protect you.

Looking to the future

It is therefore evident that specific AI legislation is needed — and fast. Despite this, the UK Government’s AI White Paper confirms that they currently have no intention of enacting AI-specific legislation. This is extremely worrying; the UK Government’s desire to facilitate AI innovation unencumbered by regulation is unspeakably destructive to our fundamental rights. It is to be hoped that, following in the footsteps of the EU AI Act and pursuant to the recommendations of a Private Member’s Bill, Parliament will be inclined to at least adopt a ‘sliding-scale approach’ whereby high-risk uses of AI (e.g. dismissals) will entail heavier regulation, and low-risk uses of AI (e.g. choosing locations for meetings with clients) will attract lower regulation. This approach would safeguard fundamental rights without sacrificing AI innovation.

Puja Patel is a law graduate from the University of Cambridge and has completed her LPC LLM. She is soon to start a training contract at Penningtons Manches Cooper’s London office. 

‘AI paralegal’ passes SQE
https://www.legalcheek.com/2023/11/ai-paralegal-passes-sqe/
Fri, 24 Nov 2023 07:48:14 +0000

74% score


The creators of an AI-powered paralegal say it has successfully passed part one of the Solicitors Qualifying Examination (SQE).

The bot, dubbed ‘Lawrence’, achieved a score of 74% compared to the typical pass rate of between 55% and 65%.

SQE1 is broken down into two Functioning Legal Knowledge (FLK) assessments and covers a broad range of legal topics including contract, tort, property, crime and trusts.

Lawrence was able to successfully answer 67 of the 90 multiple choice sample questions which appear on the Solicitors Regulation Authority’s website. This, according to Lawhive, the lawtech firm that created the bot, demonstrates its “ability to learn, digest and offer considered responses to various legal situations”.

As for Lawrence’s weak points, the creators say that while there were no clear themes, the bot did struggle with questions featuring “complex chains of logic and wider context”. It also struggled when two concepts shared similarities, confusing public nuisance with private nuisance, for example. Law students will sympathise.

But Lawrence didn’t just stop at the SQE. He (it?) and a human lawyer were presented with the same client’s will and probate case in order to compare tone, empathy and legal knowledge.

On the AI paralegal’s performance, Lawhive said:

“Whilst Lawrence managed to steer the conversation with the client to gain the necessary information about the client’s late relative’s will and assets, the conversation remained largely transactional and was half the length of the human solicitor’s. Feedback from the client was positive on both responses, but critiqued Lawrence for not showing as much empathy as the human counterpart. Lawrence also failed to question the client on their late relative’s spending habits and to ask around the wider context that ultimately uncovered financial liabilities the solicitor would need to be aware of.”

Don’t worry, though; Lawrence isn’t out of a job just yet. The SQE-passing paralegal is currently being used to support the company’s team of solicitors and legal experts.

SRA warns of AI risks – but also recognises benefits
https://www.legalcheek.com/2023/11/sra-warns-of-ai-risks-but-also-recognises-benefits/
Wed, 22 Nov 2023 07:38:30 +0000

Confidentiality and accountability among top concerns


A new report by the Solicitors Regulation Authority (SRA) has highlighted the benefits, and risks, of artificial intelligence (AI) in the legal world.

The work, which is the latest in the regulator’s ‘Risk Outlook’ series, cites previous reports which suggest that the use of AI has significantly risen across a spectrum of small, medium, and large firms.

The growth in AI, the report goes on, offers a range of opportunities. These include greater efficiency in administrative tasks, the cost reductions that follow, and an increase in transparency, with AI models offering speedy, cost-effective translations.

It’s not all good news, however, with the regulator offering a (longer) list of potential pitfalls. AI models have previously been known to invent cases and precedents, and with their speed and efficiency comes the risk that any unchecked errors may have a greater impact than individual human mistakes.

This is in addition to concerns over confidentiality and privacy, accountability, and regulatory divergence. The report also notes the criminal potential of AI, with there being at least one case where a party argued that evidence had been falsified through the use of AI.


In response to the report, Paul Philip, SRA chief executive, said:

“It is difficult to predict how quickly AI will change the legal sector, but increasingly we won’t be able to ignore its impacts. So far it has mainly been larger firms using AI. However, with such technology becoming increasingly accessible, all firms can take advantage of its potential. There are opportunities to work more efficiently and effectively. This could ultimately help the public access legal services in different and more affordable ways.”

“Yet there are risks,” he continued. “Firms need to make sure they understand and mitigate against them — just as a solicitor should always appropriately supervise a more junior employee, they should be overseeing the use of AI. They must make sure AI is helping them deliver legal services to the high standards their clients expect”.

The post SRA warns of AI risks – but also recognises benefits appeared first on Legal Cheek.

Commercial awareness: What is it and how to get it? https://www.legalcheek.com/lc-careers-posts/commercial-awareness-what-is-it-and-how-to-get-it/ Mon, 20 Nov 2023 11:08:19 +0000 https://www.legalcheek.com/?post_type=lc-careers-posts&p=197434 Good lawyers are the ones who truly understand their clients


Goodwin Procter’s Ravi Chopra talks AI, ESG and why good commercial lawyers are the ones who truly understand what drives their clients


“It is important for aspiring solicitors to have a general awareness of macroeconomic trends like inflation and geopolitical developments,” says Ravi Chopra, funds partner and UK early careers co-chair at the London office of US law firm Goodwin Procter. “Developing your commercial knowledge will help you connect with clients on a deeper level. You will be able to appreciate what drives them and their business. This will ultimately make you a better, more well-rounded lawyer.”

As a funds lawyer, Chopra often works with fund managers looking to raise capital from institutional investors, such as pension and sovereign wealth funds. This money is then invested in sectors like private equity, technology or healthcare, which is where commercial awareness comes into play.

Chopra explains this with an example. “We are seeing growth for our healthcare clients internationally,” he explains. “One reason for this is that governments may be budget-constrained, or open to private partnerships, and this has opened avenues for alternate providers. The scale, pace and innovation required in this space can be driven by private equity players.”

Chopra also keeps an eye out for the demographic trends that impact the healthcare sector. “Life expectancy has risen considerably in recent years and people want to have the best possible care. Facilities like semi-assisted living spaces, with a mix of healthcare support, catering and leisure amenities, are now increasingly in demand in the UK and the US. We also see increasing demand for life-enhancing, as opposed to life-critical, medical procedures, such as laser vision correction,” he explains.

Another trend is the heightened focus on healthy lifestyle choices, in particular across the younger demographic. “There is a trend to look after themselves. Alcohol and meat consumption seem to be at an all-time low in the UK, while  discretionary spending on healthcare is rising, which is naturally of interest to my clients,” explains Chopra.

How ESG impacts clients

When we speak about topical business trends, Chopra walks me through the impact of environmental, social and governance (ESG) considerations on his practice. “I am talking about ESG with clients a lot more than I used to,” he says. “We are also seeing investors increasingly conscious about making ESG-friendly investment decisions. It is great to see stakeholders across the board approaching the area with good intentions – there’s a real sense that they are adopting ESG because it is a win-win for society as a whole, as well as a pragmatic commercial choice.”

The application deadline for Goodwin Procter’s Spring and Summer Vacation Schemes is 15 December 2023

That said, it is important to ‘rationalise’ the approach towards ESG in transactional work. “Not every investment is going to improve the environment and the impact can vary from industry to industry. If you are buying an office or residential building, for example, there are clear implications around energy efficiency and building materials. This is to be contrasted with something like acquiring a research company where energy consumption and environmental impact may be more limited,” says Chopra. Lawyers must therefore apply a ‘common sense’ approach and tailor their advice to each specific transaction.

Is AI changing the legal sector?

Another major issue impacting the profession is technology — particularly artificial intelligence (AI). Chopra tells me that technology has been “increasingly helpful” for process and document management in his practice. At the same time, it is unlikely that AI will replace what he describes as “the humanity of business relationships”.

“If people are investing in a fund, they are investing in a team of executives who run the fund. They do their due diligence on those people, before entering into a long-term business relationship with them. Legal discussions and negotiation of terms form part of that dialogue, where clients appreciate a high quality and personal approach. There’s no denying that AI can complete certain due diligence tasks efficiently, but there is always a human oversight element involved. Plus, specialist clients often desire a high degree of tailoring, which AI may not be able to offer.”

‘Sector focused approach’

With the application season for vac schemes and TCs in full swing, Chopra also offers guidance on developing a commercial mindset. “I would encourage students to read economic and finance news from multiple sources. The more you read, the more likely it is that you will stumble upon a sector or two that interest you more than others. You can then start focusing on specific developments relevant to these sectors and develop a real depth of understanding.”

So what should your commercial awareness prep look like in practice? Chopra helpfully breaks down the process: “Let’s say you read an article about an innovation or change in a particular sector. Once you’re done reading, take a moment to consider the challenges that a client could face around such a development. First, think about how inflation, interest rates, or other macroeconomic developments might impact such a company. Then move on to consider how a client in that sector would respond. With sector-specific news consumption, you might also be able to point to the latest headwinds or best practices that businesses in the industry may be adopting,” he says.

Following this approach, you would be better able to ‘connect the dots’ when you read news pieces. “Consuming information in such a way is an important skill for a corporate lawyer,” says Chopra. “When a client comes to you with an issue, you can appreciate it in the context of both macro and micro-economic trends. Good corporate lawyers go way beyond just completing the tasks at hand. They are the ones who can work the broader context and see what the client cares about. This gives you the texture to be able to discuss the client’s business needs in much more detail and depth. And, of course, it improves your drafting and negotiation and, over time, makes you a better lawyer,” advises Chopra.

The application deadline for Goodwin Procter’s Spring and Summer Vacation Schemes is 15 December 2023

About Legal Cheek Careers posts.

The post Commercial awareness: What is it and how to get it? appeared first on Legal Cheek.

How the 2008 crash led to my career in financial services regulation https://www.legalcheek.com/lc-careers-posts/how-the-2008-crash-led-to-my-career-in-financial-services-regulation/ Mon, 06 Nov 2023 10:19:36 +0000 https://www.legalcheek.com/?post_type=lc-careers-posts&p=196657 Gowling WLG principal associate Sushil Kuner discusses cross-border work, ESG and her unique value-add for clients


Gowling WLG principal associate Sushil Kuner discusses cross-border work, ESG and her unique value-add for clients


Starting off our conversation, I ask Sushil Kuner, a principal associate at Gowling WLG, about life in the firm’s financial services regulation practice. “My day-to-day is very varied, and that’s partly due to the small size of our team. What this means is that we deal with a broad range of issues individually. So, rather than having sector specialists within the team, we do a dual role of supporting other practice groups within the firm, as well as working with our own diverse client base,” she tells me.

Kuner goes on to explain that the clients she works with include financial institutions (such as banks and insurers) and asset managers, as well as those who aren’t in financial services themselves, but offer financial products, like large auto manufacturers. “This comes about because during the consumer journey these businesses increasingly offer finance options to customers and are often regulated for consumer-credit related activities. We also act for a range of housing developers where regulatory considerations around things like shared ownership schemes, help to buy and second charge mortgages are often required,” she details.

The benefit of this structure is that it allows a good deal of diversity in terms of the issues Kuner’s able to deal with each day, so things are kept interesting and challenging. “It keeps us on our toes, especially because the regulatory landscape is constantly evolving in any case. Think crypto assets, for example, around which there is a fairly new and ever-developing regime,” she points out.

I took Kuner’s mention of crypto as an opportunity to ask for her perspective on the issues that students should keep an eye out for around artificial intelligence (AI), the metaverse and cryptocurrencies. She cautions, “each one of these aspects is a huge area, and moreover, I look at them largely from a financial services regulatory angle. So, keep in mind that you can consider each from a range of perspectives. With AI, for instance, aside from financial services regulatory, two very different, but equally interesting, legal issues surround intellectual property (IP) rights and data privacy,” Kuner notes. Meanwhile, you could be using crypto assets to make purchases in the metaverse, but once again, the financial services regulation is just one facet in a whole array of legal issues that could arise, for example, tax considerations, she points out.

Kuner also went on to speak about her experience of working with the firm’s US and India-based clients and the skills needed to work on matters with a strong cross-border dimension. “Sticking with the theme of crypto assets, the nature of the sector is that businesses operating in this space can be based anywhere in the world. Now, if they want to do business with UK consumers, they come to us to ask for perimeter guidance,” she explains. This entails assessing their business model to see if they are conducting activities regulated within the UK, and if so, helping them navigate this process to establish themselves in the UK.

The application deadline for Gowling WLG’s 2024 Summer Vacation Scheme (London and Birmingham) is 22 November 2023

While Kuner acknowledges that a large majority of clients do speak English, the language barrier is, however, not completely eliminated when working with international clients. “The trickiness comes in because you have to be much clearer and more articulate – for instance, when you’re having to break down complex terms that you might be familiar with, but a client isn’t, particularly in other jurisdictions,” she tells me. Understandably, this is a key skill to ensure alignment of interests and objectives, as well as managing client expectations.

Kuner also draws my attention to an additional dimension of cross-border client work at a global law firm — project management. “We’ll sometimes have a US-based client that wants to start doing business in Europe. Now, we make it clear at the outset that we only advise on English law, but the client can then ask us if we can project manage their enterprise. So, once we produce an initial memo based on English law, we would share that with our counterparts in other jurisdictions and seek their legal opinion,” she details. “This enables the client to see how the positions differ between different jurisdictions.”

When we chatted about Kuner’s career journey, she urged the next generation of lawyers not to “pigeonhole themselves early on” and to “be open to possibilities”. She qualified as a corporate lawyer in 2007, pre-credit crash; “the wrong time to qualify into corporate,” she tells me. After two years of buoyant activity, with back-to-back completions, she was faced with dwindling work and law firms across the board making redundancies. Kuner decided to move to Canada on a one-year working visa where she joined a Big Four firm’s Vancouver office.

“If there’s one thing I would tell students, it’s to not have tunnel vision and think ‘I’m a lawyer, I can only do a legal job at a law firm’. Seeing the events of the crash unfold really opened up my eyes to financial services — so when I came back after the one-year working visa expired, I applied for a role at the Financial Services Authority, as the Financial Conduct Authority (FCA) was then called,” she details.

While Kuner started off on a six-month placement, she ended up staying for 8 years, moving around various teams. “I wrote key external industry-facing documents in the FCA’s Supervision division, and also spent four years as a case lawyer and lead investigator in its Enforcement division”, Kuner explains. Unsurprisingly, these experiences are now invaluable to her career at Gowling WLG, as her insights into the FCA’s processes give her a unique value-add when it comes to advising clients. “If you haven’t worked at the FCA before, and you’re regulated by it, it can be a scary beast. But because I’ve got that understanding of its strategic priorities and how it makes its decisions, I’m able to bring that added perspective, and it’s certainly something that clients appreciate”, she tells me.

Approaching the end of the interview, I was also curious to get Kuner’s insights on the role played by financial services in Environmental, Social and Governance (ESG) considerations, given the increasing emphasis on these in recent years.

“The regulatory angle on ESG in financial services is huge,” she says. “The UK government has been making it clear since around 2018 that financial services are a key driver in the net zero transition — after all, they help determine where capital is deployed.”

Kuner continues: “With that in mind, the regulators have put in place a number of initiatives with respect to disclosure, addressing listed issuers, large asset management firms and big capital owners. This is effectively a whole new disclosure regime which requires these players to report on their climate-related metrics and policies. With investors and consumers being more interested in firms’ ESG policies to ensure that their capital is being steered in a meaningful direction, the role of financial services regulation in relation to ESG is significant.”

The application deadline for Gowling WLG’s 2024 Summer Vacation Scheme (London and Birmingham) is 22 November 2023

Gowling WLG’s Sushil Kuner will be speaking at ‘The Big Commercial Awareness Themes of 2023-24 — with DWF, Goodwin Procter, Gowling WLG, Lewis Silkin, Squire Patton Boggs and ULaw’, a virtual student event taking place THIS AFTERNOON (6 November). Apply now to attend.

About Legal Cheek Careers posts.

The post How the 2008 crash led to my career in financial services regulation appeared first on Legal Cheek.

How I help clients navigate the world of AI https://www.legalcheek.com/lc-careers-posts/how-i-help-clients-navigate-the-world-of-ai/ Tue, 17 Oct 2023 07:48:42 +0000 https://www.legalcheek.com/?post_type=lc-careers-posts&p=195449 Bird & Bird senior associate Will Bryson discusses his work in the firm’s tech transactions team


Bird & Bird senior associate Will Bryson discusses his work in the firm’s tech transactions team


“I really enjoy negotiating contracts that everyone is happy with,” says Will Bryson, senior associate in Bird & Bird’s tech transactions team. Having initially flirted with the idea of being an IP lawyer, Bryson quickly understood that he preferred the commercial tech space. “For the past few years, we have really been talking about artificial intelligence (AI) for its transformational impact on society. I am very passionate about solving legal challenges that lie at the heart of this change.”

As part of his role in the tech transactions team, Bryson often helps businesses looking to buy technology products. “Our clients do not always understand the complexities of the technological tools that they are acquiring and deploying. This is where tech lawyers step in,” he says.

Applications for Bird & Bird’s 2024 Spring and Summer Vacation Schemes are now open

For example, Bryson’s team recently helped a large consumer goods company buy an Internet of Things (IoT) platform. Essentially, the client wanted to acquire the full tech stack that would underpin its software across all devices, the goal being that, once successfully deployed, this collective network of devices would improve the functionality of its products.

“Our role as lawyers in such transactions can often come in multiple capacities,” Bryson explains. “Often the clients want to adopt AI tools but are worried about its risks. They are unsure of what they can and cannot use the AI for. We help them understand the license and use terms, and work with them to understand the risk profile of the asset. This enables them to make an informed decision about internal use of the tool,” he says. A lot of Bryson’s clients have successfully used generative AI tools for various business functions, including things like marketing and branding.

Once the client decides to incorporate AI within their businesses, they might seek support to procure these tools from suppliers. “For such clients, we are involved in the procurement of relevant technologies. To this end, we would typically negotiate technology contracts between the buyers and sellers, making sure the terms work for our clients,” says Bryson.

Legal challenges with AI

But the negotiation of these technology contracts is far from simple, according to Bryson. There are a plethora of legal issues cutting across different areas of law.

“One of the primary issues is that of fault attribution — i.e., who takes the blame when things go wrong?” he says. “Generative AI tools like ChatGPT often tend to hallucinate, meaning that they can produce inaccurate or illogical results. The main question that we as lawyers drafting these contracts face is how to apportion risk between the parties if such events happen. We consider how much pressure can be put on suppliers in terms of warranties and obligations if their AI makes mistakes.”

Another issue around AI fallibility is that of ‘causation’, or tracing the reason behind the technical glitch. Bryson explains this further: “Effectively these models are black boxes. They train on vast amounts of data that will enable AI to make its decisions and predictions, but you cannot tie a particular outcome with a particular input or dataset. Who should accept fault, when no one really knows what caused the problem, is a thorny issue in contractual drafting.”

Applications for Bird & Bird’s 2026 Training Contract are now open

Using ChatGPT wisely

Problem areas do not end here. Generative AI is often questioned from an intellectual property (IP) perspective too. “There are big questions around ownership of the outputs of generative AI systems,” says Bryson. “Parties are used to using contracts to allocate ownership of intellectual property rights in outputs from a service, but where those outputs are created by an AI there may not be any IP to own! If there is nothing to own, what protections should be built into the contract for you is another question that we address for our clients.”

Amidst concerns around data privacy and confidentiality, Bryson is quite hopeful about the future of such tools. “Lawyers have been putting our data into computers and IT systems for decades, so this is not a novel problem at all. It’s about whether we are conducting this exercise in a safe manner,” he says. “There are concerns as to whether the data you feed into the system is being re-used (for example, for further training the system) and so could be disclosed to third parties. Providers of AI solutions clearly recognise this concern and many versions now allow you to ‘opt out’ from your data being reused. This should hopefully take care of some of these confidentiality-related concerns.”
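
As a purely hypothetical illustration of the kind of ‘opt out’ Bryson describes, a client-side configuration might expose flags along these lines. The client class and field names below are invented and do not correspond to any real provider’s API; in practice such controls often live in account settings or contractual terms rather than code.

```python
# Invented example only: a hypothetical AI client configuration with a
# data-reuse opt-out. Field names do not correspond to any real provider.

from dataclasses import dataclass

@dataclass
class AIClientConfig:
    api_key: str
    allow_training_on_inputs: bool = False  # opt out of prompts being reused for training
    retain_prompts_days: int = 0            # do not retain prompts at rest

config = AIClientConfig(api_key="<redacted>")
print(config.allow_training_on_inputs)  # False: data-reuse opt-out applied
```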

Commercial awareness and careers advice

Ahead of his appearance at this afternoon’s Legal Cheek event, Bryson also shares his top tips for students interested in the tech space. “Boosting your commercial awareness is a great way to demonstrate your interest in this area,” Bryson says. “I would encourage students to follow news publications around technology as the landscape changes very quickly. I have built some news reading time into my daily schedule, where I would read from sources like Ars Technica, the Financial Times and Wired magazine. Newsletters like that of Benedict Evans are also a great place to follow interesting trends.”

Alongside developing commercial awareness, Bryson also advises students to be passionate about the field. “Eventually, your enthusiasm is going to shine through at assessment centres,” he says. “Firms love to see candidates who have researched them and know where their strategies lie. But that’s not enough. If you want to really stand out from the crowd, you must also make a case about how your passion and ambition align with that of the firm you are applying to.”

Applications for Bird & Bird’s 2024 Spring and Summer Vacation Schemes are now open

Will Bryson will be speaking at ‘ChatB&B: The Power of AI in Law – with Bird & Bird’, a virtual student event taking place THIS AFTERNOON (Tuesday 17 October). This event is now fully booked. Check out our upcoming fairs and student events.

About Legal Cheek Careers posts.

The post How I help clients navigate the world of AI appeared first on Legal Cheek.

My journey from paralegal to lawtech expert https://www.legalcheek.com/lc-careers-posts/my-journey-from-paralegal-to-lawtech-expert/ Thu, 12 Oct 2023 08:48:07 +0000 https://www.legalcheek.com/?post_type=lc-careers-posts&p=195142 Addleshaw Goddard’s innovation manager reflects on his path into law and what AI means for lawyers


Addleshaw Goddard’s innovation manager reflects on his path into law and what AI means for lawyers


It was an unconventional career journey for Michael Kennedy, senior manager in the innovation and legal technology team at Addleshaw Goddard. After graduating with a law degree from Kent, he stumbled upon a paralegal position at the firm, where he often carried out work that he thought could be done more efficiently with technology or improved processes. This slowly transformed into a full-blown passion for innovation, an area where Kennedy has chosen to stay more than eight years on.

“I remember that the firm was still setting up the innovation team when I was a paralegal and they were looking for more hands-on involvement which is where I came in,” says Kennedy. “My paralegal role later converted to a training contract, by which time I had found my calling for innovation and so I decided to balance both the roles.”

Splitting time between his usual training contract seats and the legal tech work was not straightforward. “At the time, there were three legal trainees at the firm who were also a part of the innovation team. As one of us would do a training seat, the remaining two would stay back and work towards innovating our legal service delivery. This ensured that all of us had a chance to do everything,” says Kennedy. “We essentially looked at the Solicitors Regulation Authority’s (SRA) qualification standards and worked our way backward to ensure that we could complete our training while not compromising on innovation work.”

Applications for Addleshaw Goddard’s 2024 Easter and Summer Work Placements are now open and close on 4 January 2024

Now, of course, the innovation team at Addleshaw is much bigger and is offered as a standalone seat as part of the traditional training contract. The firm also runs a legal technology and innovation scheme, which is a two-year graduate program running parallel to its training contract offering.

Working with lawyers to build legal tech tools

As a senior manager on the innovation team’s research and development division, Kennedy often conducts horizon scans for emerging technologies.

“We research what is coming up in the commercial world, and what problems we have as a firm that can be improved through technology,” he tells me. “We also chat with clients to learn about their concerns. Once we have this knowledge, we try to leverage technology such as Generative Artificial Intelligence (Gen AI) and machine learning tools, to build products that solve those problems.”

Perhaps unsurprisingly, building legal tech tools often requires collaborating with lawyers across the firm’s full-service offering. When we speak, Kennedy tells me that one of the projects keeping him busy is running team-by-team “ideation sessions” for different practice groups at the firm. “We sit with each legal team and look at some ideas where AI could improve their work. We run them through some demos and then brainstorm whether this could be a fruitful feature to capitalise on.” Once the lawyers are happy with the tool, the innovation and legal technology team will start the roll-out process, building anything necessary and working with AG lawyers to drive adoption.

Applications for Addleshaw Goddard’s 2024 Easter and Summer Work Placements are now open and close on 4 January 2024

“One tool that we recently delivered on was focused on simplifying advising our clients on changing regulations in the commercial world,” says Kennedy. “Our lawyers proposed a product that would ask clients a series of questions to assess whether their business would be ‘high risk’ or ‘low risk’ in the context of the new financial services regulations. In the product’s development phase, we again took support from specialist lawyers from different areas. If the question pertained to real estate disputes, for example, we would interview a lawyer in that group to frame a particular question for the client.”
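
A rough sketch of how a questionnaire-driven triage tool of this kind might work is below, assuming invented questions, weights and threshold; the logic of the actual product is not public.

```python
# Illustrative only: score a client's answers to a fixed questionnaire and
# map the total to a 'high risk' / 'low risk' label. Questions, weights and
# the threshold are invented for this sketch.

QUESTIONS = [
    ("Do you hold client money?", 3),
    ("Do you market financial products directly to consumers?", 2),
    ("Do you operate in more than one jurisdiction?", 1),
    ("Have you been subject to regulatory enforcement before?", 3),
]

RISK_THRESHOLD = 4  # invented cut-off between 'low' and 'high'

def classify(answers: dict[str, bool]) -> str:
    """Sum the weights of questions answered 'yes' and label the client."""
    score = sum(weight for question, weight in QUESTIONS if answers.get(question))
    return "high risk" if score >= RISK_THRESHOLD else "low risk"

client_answers = {
    "Do you hold client money?": True,
    "Do you market financial products directly to consumers?": True,
}
print(classify(client_answers))  # 'high risk' under these invented weights
```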

Using ChatGPT, but in a secure way

It is not a surprise that the most wide-scale AI application, ChatGPT, has made a significant impact on the legal sector. “The launch of ChatGPT was the most excited we have seen lawyers about the potential of technology in their work,” Kennedy remarks.

He continues to tell me about the firm’s efforts to capitalise on ChatGPT to improve legal advice.

“ChatGPT impacts firms like ours in many ways,” he says. “Since it raises concerns around data privacy and localisation, we quickly drafted an internal firm policy to regulate its usage. We also spoke to our suppliers and vendors in the market to understand how they were using the tool. We then launched our own version of the platform, AGPT, which is based in a much more secure environment, thus allowing lawyers to use it with confidential information pertaining to our clients. At the minute, we have over 150 lawyers that are a part of our working group testing these Gen AI tools, with AGPT now being rolled out firm-wide. The feedback we receive means we can continue making it better equipped to handle our work.”

AGPT helps Addleshaw lawyers in a variety of ways. “It can detect things like red flags in a lease document or the key risks in a share purchase agreement. When you have hundreds of documents, you can simply ask an AI tool specific questions like ‘Is there a limitation of liability in any of these documents?’, and it will condense that search for you.”
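
A simplified sketch of that kind of bulk question-answering is below; `ask_model` is a placeholder for whichever internally hosted model a firm uses, since AGPT’s actual interface is not public.

```python
# Illustrative only: put the same question to every document in a set and
# collect the ones flagged as relevant. `ask_model` is a stand-in for an
# internally hosted, access-controlled model, not a real API.

def ask_model(prompt: str) -> str:
    raise NotImplementedError("stand-in for the firm's own model endpoint")

def screen_documents(documents: dict[str, str], question: str) -> list[str]:
    """Return the names of documents the model flags for the given question."""
    flagged = []
    for name, text in documents.items():
        prompt = (
            f"Answer 'yes' or 'no' only. {question}\n\n"
            f"Document:\n{text[:4000]}"  # crude truncation for the sketch
        )
        if ask_model(prompt).strip().lower().startswith("yes"):
            flagged.append(name)
    return flagged

# Usage (requires a real ask_model implementation):
# hits = screen_documents(docs, "Does this document contain a limitation of liability clause?")
```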

Would AI replace lawyers?

But adoption of these functions by legal tech tools might not be enough to render lawyers jobless. “Law is a very human-oriented profession and people skills are at the heart of what we do,” says Kennedy, further explaining that AI can “only supplement, and not replace, the work of lawyers. A lot of what we do is to provide legal advice in very specific contexts and AI cannot assume that function.”

“Where AI can be used is in automating tasks like document review and drafting. If lawyers had to review, say, 400 documents for a large matter, it would take many days. With machine learning, we would run them through software first, which can reduce the search to, say, five or ten relevant documents which lawyers can then give a detailed review. Used in this way, legal tech allows clients to get more value for their money as lawyers spend their time more effectively,” says Kennedy.
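
One simple way to do that kind of shortlisting is sketched below using a plain TF-IDF ranking; it stands in for whatever (likely more sophisticated, often supervised) tooling the firm actually uses.

```python
# Illustrative only: rank a document set against a reviewer's query and keep
# the top few for detailed human review.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def shortlist(documents: list[str], query: str, keep: int = 10) -> list[int]:
    """Return the indices of the `keep` documents most similar to the query."""
    vectorizer = TfidfVectorizer(stop_words="english")
    doc_matrix = vectorizer.fit_transform(documents)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    return sorted(range(len(documents)), key=lambda i: scores[i], reverse=True)[:keep]

# Usage: indices = shortlist(all_400_documents, "change of control consent", keep=10)
```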

Ahead of his appearance at tomorrow’s Legal Cheek event, Kennedy also shares his advice for those interested in applying for Addleshaw’s technologist program. “We are looking for people who are interested and enthusiastic about the work we do. Often, this passion is demonstrated at graduate events where students who are asking smart questions really stand out. Ultimately, we will train you for everything so all we are really looking for at this stage is potential!”

Michael Kennedy will be speaking at ‘Generative AI: opportunities and challenges — with Addleshaw Goddard‘, an in-person student event taking place tomorrow (13 October). Places for this event are now fully booked, but check out our other upcoming events.

Applications for Addleshaw Goddard’s 2024 Easter and Summer Work Placements are now open and close on 4 January 2024

About Legal Cheek Careers posts.

The post My journey from paralegal to lawtech expert appeared first on Legal Cheek.

Eversheds appoints ‘global head of AI’ https://www.legalcheek.com/2023/10/eversheds-appoints-global-head-of-ai/ https://www.legalcheek.com/2023/10/eversheds-appoints-global-head-of-ai/#comments Fri, 06 Oct 2023 07:50:09 +0000 https://www.legalcheek.com/?p=194679 Introduces AI skills course too


Introduces AI skills course too

Eversheds Sutherland has appointed its first global head of artificial intelligence (AI) as the profession continues to embrace technological change.

United Arab Emirates partner Nasser Ali Khasawneh will oversee the firm’s AI strategy, ensuring consistency between its AI client advisory practice and its own use of AI.

His new role will also see him head-up the firm’s newly formed global AI leadership team, made up of lawyers from across the firm’s offices in the UK, Ireland and the US as well as its consultancy service Konexo.

Khasawneh has represented some of the world’s largest information technology, media and consumer companies, advising them on a range of commercial, licensing, cloud computing and IP rights matters, according to his firm profile. He also spent four years as a lawyer at Microsoft.

Separately, Eversheds has also announced the launch of a new AI skills programme for all lawyers and business staff. The first stage of this programme will be delivered through the new ‘Generative AI Fundamentals for Law Firms’ training developed by e-learning outfit SkillBurst.

This is in addition to the creation of a global AI task force, featuring a team of lawyers and business professionals from across the firm who will be reviewing the potential development and use of AI products.


Commenting on his new role, Khasawneh said:

“I am honored to take on this very exciting new role as Global Head of AI. AI is without a doubt the most significant development in the technology space for a generation. This technology doesn’t belong to one geography, sector or practice group — my appointment will ensure that the firm takes a global approach in helping our clients consider the rapidly developing potential offered by generative AI.”

His appointment follows the news that Macfarlanes had adopted ‘Harvey‘, an AI bot that uses ChatGPT technology to “automate and enhance” various aspects of legal work. The bot is also being used by lawyers at Allen & Overy.

The post Eversheds appoints ‘global head of AI’ appeared first on Legal Cheek.

Macfarlanes adopts AI bot ‘Harvey’  https://www.legalcheek.com/2023/09/macfarlanes-adopts-ai-bot-harvey/ Fri, 22 Sep 2023 07:06:55 +0000 https://www.legalcheek.com/?p=193964 Follows A&O


Follows A&O

Macfarlanes has become the latest City law firm to adopt AI chatbot ‘Harvey’ following a successful pilot earlier this summer.

Harvey uses ChatGPT technology to “automate and enhance” various aspects of legal work such as reviewing, analysing and summarising documents, essentially lightening the load for lawyers so they can, at least in theory, focus on meatier legal matters.

The bot — which is backed by the OpenAI Startup Fund, a fund that invests in startups, especially AI companies — can also answer general legal questions and even turn its hand to drafting.

All of Harvey’s outputs are “carefully monitored and reviewed” by lawyers, Macfarlanes says.

As reported by Legal Cheek, Allen & Overy was the first major law firm to adopt the tool earlier this year, while several other big legal players have reportedly expressed an interest in bringing Harvey on board.

Commenting on the firm’s adoption of the AI tool, Macfarlanes’ head of lawtech and chief knowledge & innovation officer, Chris Tart-Roberts, said:

“The potential for generative AI in law lies in augmentation; to support lawyers to do elements of their job better and smarter, benefiting users of legal services via improved efficiency and enhanced service. We are excited to be at the forefront of this technology’s evolution, which has potential to shift the paradigm. Partnering with Harvey provides a unique chance to be a part of the development of transformative AI.”

Last month Legal Cheek reported on research which found that three-quarters of UK lawyers believe AI will lead to an uptick in the amount of legal work undertaken by those without “traditional legal qualifications”.

The post Macfarlanes adopts AI bot ‘Harvey’  appeared first on Legal Cheek.

‘Jolly useful’: Court of Appeal judge’s verdict on ChatGPT https://www.legalcheek.com/2023/09/jolly-useful-court-of-appeal-judges-verdict-on-chatgpt/ https://www.legalcheek.com/2023/09/jolly-useful-court-of-appeal-judges-verdict-on-chatgpt/#comments Mon, 18 Sep 2023 13:16:39 +0000 https://www.legalcheek.com/?p=193779 Helps with legal research


Helps with legal research

A Court of Appeal judge has admitted using ChatGPT to help him prepare a recent judgment.

Lord Justice Birss labelled the AI programme “jolly useful”, citing its “real potential” for future use within the legal sector.

In a speech held at the Law Society, and subsequently reported on by The Law Gazette, Birss LJ is quoted as saying: “I think what is of most interest is that you can ask these large language models to summarise information. It is useful and it will be used and I can tell you, I have used it.”

The Cambridge grad and former materials scientist did, however, clarify that he only used the AI programme after researching the area of law. “I know what the answer is because I was about to write a paragraph that said that,” he told the audience.

Speaking of responsibility for the contents of the judgment, the top judge went on:

“I’m taking full personal responsibility for what I put in my judgment, I am not trying to give the responsibility to somebody else. All it did was a task which I was about to do and which I knew the answer and could recognise as being acceptable.”


But Birss LJ isn’t the first judge to employ the AI tool. As reported by Legal Cheek, a judge in Colombia hit the headlines earlier this year when he used ChatGPT to assist him in solving a dispute between a health insurance company and the guardian of an autistic child.

Elsewhere, a US judge recently issued a joint fine to two lawyers involved in a matter in which non-existent cases were submitted to the court after ChatGPT was used for legal research.

The post ‘Jolly useful’: Court of Appeal judge’s verdict on ChatGPT appeared first on Legal Cheek.
