I have been testing Natural Language Processing (NLP) and LLM tools for legal redlining and contract search for quite a while now and have seen many different companies offer many different solutions. My conclusions, based on everything I’ve used or seen, are:
- GenAI specifically can only help legal with admin work
- Contractual drafting is accuracy-based work, and GenAI is not good with nuance and accuracy
- Core legal functions like interpretation of legal language, legal strategy, complex drafting and negotiations are not tasks that should be outsourced to AI (and AI is poor at doing them)
My main argument here is that how GenAI is used for legal work depends heavily on the importance of the task. Below, I outline how I see GenAI being integrated into a legal professional's life and what its pros and cons are:
Full Integration - Administrative Tasks
Full integration of AI should be possible for tasks like document filing, organization and search; basic formatting and text editing; simple template generation (e.g., NDAs); and automation of archives (e.g., archived documents are automatically placed in the correct part of the archive and renamed based on the document's title).
Document Filing, Organization and Search
Finding the relevant document can sometimes take up hours of our time as legal professionals, and this is one area where GenAI can add the most value. By integrating GenAI capabilities in the contract repository, for example, it would be possible to find a document based on a few keywords. This would be a much more advanced search function, where a legal professional can type a prompt to look for a contract with a specific context, and the respective GenAI tool could find it in seconds. The same applies to emails, Microsoft Teams communications, etc., but this is not specific to legal professionals and can be applied across departments.
Here, the risk stemming from the GenAI tool is fairly low because the task leaves little room for hallucinations. The tools would search for documents and clauses in a specific repository, and legal professionals would then see the exact clause in the respective contract, which means a human double-checks for truthfulness by default. The same goes for document organization: lawyers should be able to frame their prompt to "filter" what kinds of agreements or clauses are searched. For example, asking the tool to find all clauses that allow contractual notices to be provided via email returns a list of all documents where notices can be given via email. The main issue in this use case is confidentiality, privacy and security. AI companies that provide such tools should be able to show their clients that all data provided to them is secure, controllable by the client and kept confidential.
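To make this concrete, here is a rough sketch of what clause-level search over a repository could look like, using plain TF-IDF similarity as a stand-in for whatever retrieval layer a commercial GenAI tool would actually provide. The file names and clause texts are invented for illustration, and the point is only that the result always points back to an exact clause a lawyer can verify.

```python
# Minimal sketch of clause-level retrieval over a contract repository,
# assuming contracts have already been split into (file, clause_text) pairs.
# TF-IDF similarity stands in for the retrieval layer of a real GenAI tool.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

clauses = [
    ("msa_acme_2023.docx", "Notices under this Agreement may be given by email to the addresses below."),
    ("nda_globex_2022.docx", "All notices shall be delivered by registered mail only."),
    ("dpa_initech_2024.docx", "Either party may provide notice by email with confirmation of receipt."),
]

query = "clauses that allow contractual notices to be provided via email"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform([text for _, text in clauses] + [query])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

# Surface the source file and the exact clause so a lawyer can check it in context.
for (filename, text), score in sorted(zip(clauses, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {filename}: {text}")
```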
Basic Formatting and Text Editing
Providing correct headings, checking for grammar errors, fixing small formatting errors and generally making text easier to read and more structured is a simple, straightforward use case for something like Microsoft Copilot. For lawyers, this is one of the most time-consuming, least profitable and least meaningful tasks, and it can be outsourced to software, be it GenAI or simple functionality in tools like Microsoft Word. The question here is whether GenAI adds enough value for these tasks, as tools for text editing, grammar checking and the like have existed for quite a while and are more mature technologies that people are used to. Furthermore, data security and confidentiality should be considered here as well, since any GenAI tool needs a lot of access, and there are AI companies that want to use all data provided to them to train their own models.
Simple Template Generation
Standard, low-complexity documents like NDAs can be compared to a company's internal standards, and the GenAI tool can provide information about the differences and, based on the internal company playbooks, whether they are problematic at all. The same applies to generating simple addenda, annexes, etc. With a fairly simple prompt, lawyers can get a ready-made document that requires only a little editing, rather than starting from scratch. The risk here is automation bias and relying too much on AI tools to create contractual language that may need to be very specific. The easy solution is to apply critical thinking and never fully rely on AI. However, that's sometimes easier said than done, so organizations need to be careful when implementing tools in that space and need to provide quality training to employees to mitigate the risk.
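The deterministic half of that comparison is easy to picture. The sketch below surfaces the differences between an incoming NDA and a company standard using Python's difflib; the clause texts are placeholders, and whether a given deviation is actually problematic still has to come from the playbook and the reviewing lawyer (or the GenAI tool trained on that playbook).

```python
# Minimal sketch: show where an incoming NDA deviates from the company standard,
# so the deviations can be checked against the playbook. Placeholder text only.
import difflib

standard_nda = [
    "Confidential Information must be protected for a period of five (5) years.",
    "Neither party may assign this Agreement without prior written consent.",
]
incoming_nda = [
    "Confidential Information must be protected for a period of two (2) years.",
    "Neither party may assign this Agreement without prior written consent.",
]

diff = difflib.unified_diff(
    standard_nda, incoming_nda,
    fromfile="standard_nda", tofile="incoming_nda", lineterm=""
)
print("\n".join(diff))
```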
Automation of Archives
This is again tied to GenAI's search capabilities, as well as its sorting capabilities. For example, a company may use AI to automatically archive all documents that are X years old in a separate folder within the repository, saving employees from having to do it and simplifying documentation tracking for audit purposes. AI could also help with searching and creating reports from sections of contracts, so legal professionals can quickly and easily access important contractual language.
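As a rough illustration, the rule-based part of such archiving might look like the sketch below. The folder names, the ".docx" extension, the seven-year cut-off and the use of file modification time are all assumptions made for the example; a real tool would more likely read dates and titles from contract metadata, and a human should still review what lands in the archive.

```python
# Minimal sketch of rule-based archiving: move contracts older than a cut-off
# into an archive folder. Paths, extension and threshold are placeholders.
import shutil
import time
from pathlib import Path

REPO = Path("contracts")
ARCHIVE = REPO / "archive"
CUTOFF_SECONDS = 7 * 365 * 24 * 3600  # roughly seven years, illustrative only

ARCHIVE.mkdir(parents=True, exist_ok=True)
now = time.time()

for doc in REPO.glob("*.docx"):
    # File modification time stands in for the contract date here.
    if now - doc.stat().st_mtime > CUTOFF_SECONDS:
        shutil.move(str(doc), str(ARCHIVE / doc.name))
        print(f"Archived {doc.name}")
```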
The risk here comes from (1) hallucinations and incorrect citations and (2) unseen mistakes that influence the AI tool’s “decisions” on which contracts to archive. Keep in mind that hallucinations are not going away, so a human should check periodically which contracts have been archived and whether it was done correctly. Regarding unseen mistakes, the garbage-in, garbage-out principle applies here, and if the quality of the contracts or the contracts repository is low, using AI can be a catastrophe. So, before implementing a solution like that, companies and law firms would need to make sure their data is high-quality, clean and appropriate for AI usage.
Hybrid Tasks - Human-AI Collaboration
While administrative tasks can be largely automated, a middle ground exists where AI assistance proves valuable but cannot operate independently. Hybrid tasks, such as legal research, initial review of GTCs, template creation for more complex documents and automated updates of templates and policies, are a good example of this. There, the human remains the driving force but is aided by AI.
Transparency and Reasoning Requirements
Critical for each of these tasks is transparency of the AI's "reasoning." We need to be able to understand which sources were used in research, so we can double-check them, as hallucinations are very prevalent in that sphere. The structure of any templates provided to an LLM must be explained in the prompts very carefully and at length. The same applies to explaining why certain clauses were flagged in the initial, human-led GTC reviews used to train any GenAI "agent" or "Copilot."
Legal Research
For research, the most important thing is source citations. It has to be hybrid work because of the many errors AI tools make, but it can be useful in identifying internal documents. For wider legal research, it can also be useful in providing an overview of the issue and where we should delve deeper. However, the work involved in gaining a deep understanding of the articles and legal issues captured in the research is very important and should never be outsourced to AI.
Initial Review and Redline of General Terms and Conditions (GTCs)
One good use case for GenAI is the initial review and redlining of GTCs. I have tried this many times now, but AI tools are very bad at understanding context, nuance and risk, so a legal professional always needs to read the documents in full. However, AI tools can be useful in providing a general idea of how balanced or unbalanced a contract is, so resource allocation for more complex issues can be based on that. Some initial redlines based on a company's or law firm's playbooks may also be helpful, but hallucinations and the aforementioned lack of contextual understanding may be problematic.
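To show the shape of such a playbook-driven first pass, here is a minimal sketch using the OpenAI Python client. The model name, the playbook rule and the clause are placeholders, and the output is nothing more than a starting point that a lawyer verifies against the full document.

```python
# Minimal sketch of a playbook-driven first-pass clause check.
# Playbook rule, clause and model name are placeholders; the answer is a
# starting point for human review, never a final redline.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

playbook_rule = "Liability must be capped at 12 months of fees, except for data protection breaches."
clause = "Supplier's total liability shall be unlimited for all claims arising under this Agreement."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You review contract clauses against a playbook rule. "
                                      "Answer with COMPLIANT or FLAG, followed by a one-sentence reason."},
        {"role": "user", "content": f"Playbook rule: {playbook_rule}\nClause: {clause}"},
    ],
)

print(response.choices[0].message.content)
```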
Template Creation for Complex Documents
The main advantage of AI for template creation is generating structure. If trained well enough and on good data, an AI tool can be very useful for creating structured templates and checking whether elements are missing from agreements customized based on the template. For example, when reviewing a client contract, AI can compare it to a standard and say what is lacking or superfluous. Drafting specific clauses still has to be performed by humans, but the general structure can be done by AI.
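One simple way to picture the "lacking or superfluous" check is a comparison of section headings between the standard template and the customized agreement, as in the sketch below. The section names are invented, and extracting headings from real documents is the harder part that the tooling would have to handle.

```python
# Minimal sketch of a completeness check: compare section headings of a
# customized agreement against those of the standard template.
# All section names below are invented for illustration.
TEMPLATE_SECTIONS = {"Definitions", "Term", "Fees", "Confidentiality", "Liability", "Termination", "Governing Law"}
CONTRACT_SECTIONS = {"Definitions", "Term", "Fees", "Liability", "Governing Law", "Marketing Rights"}

missing = TEMPLATE_SECTIONS - CONTRACT_SECTIONS  # present in the standard, absent in the contract
extra = CONTRACT_SECTIONS - TEMPLATE_SECTIONS    # added relative to the standard

print("Missing from the contract:", sorted(missing))
print("Not part of the standard:", sorted(extra))
```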
Policy Reviews and Updates
If trained on good resources, AI can aid policy reviews by providing relevant information on new government regulations that may be unclear or missing in a policy. For example, in the case of a new regulation like the AI Act, the AI tool can check whether the company has accounted for the new regulation in its policies and propose amendments. Then, the policy owners can review the policy based on the AI tool's feedback. However, the risk of hallucinations is still present here, and there are other tools based on NLP or machine learning that can be more helpful.
Automatic Template Updates for Regulatory Changes
In the event of regulatory changes, template updates can be automated using GenAI tools. For example, the Standard Contractual Clauses (SCCs) by the EU Commission that are used for Data Processing Agreements were updated a few years ago. A GenAI tool can assist data privacy professionals by automatically making updates to those SCCs in the company's standard DPAs. Then, a legal professional can verify whether the update was made correctly. Again, human labor is aided, not substituted. At the end of the day, lawyers and privacy professionals must confirm everything is OK. The issues resulting from this specific GenAI use case are again hallucinations and automation bias (i.e., the lawyers trusting the LLMs too much for "simple" things and not proofreading properly).
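The deterministic half of that workflow, finding which templates still need the update, can be as simple as the sketch below, which flags DPA templates that still cite the pre-2021 SCC decision. The folder layout and plain-text format are assumptions for the example; the actual clause replacement and verification remain with the privacy professional.

```python
# Minimal sketch: flag DPA templates that still reference the old SCC decision,
# so they can be queued for update and human review. Paths are placeholders.
import re
from pathlib import Path

OUTDATED = re.compile(r"Decision\s+2010/87/EU", re.IGNORECASE)

for template in Path("templates/dpa").glob("*.txt"):
    text = template.read_text(encoding="utf-8")
    if OUTDATED.search(text):
        print(f"{template.name}: references the old SCCs, queue for update and review")
```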
Pure Legal Tasks - No AI Integration
There are three main reasons not to use GenAI for any of the processes that follow (and this list is not exhaustive): accountability, automation bias and cognitive debt.
Lawyers have to be accountable for the work they do, and if they outsource important and core parts of the legal profession to GenAI, they will have trouble down the road. This is evident in the many cases where judges have caught lawyers citing fake precedents and court decisions.
Automation bias can also easily become a huge problem, as demonstrated by the number of fake cases making their way into court. This points to lawyers trusting GenAI tools too much and not proofreading bot output as they should. Since the text these tools produce can sound very convincing, a lot of lawyers allow their judgment to be clouded by LLM outputs.
Lastly, studies have shown that cognitive debt is already impacting students, workers and most people who use GenAI. The combination of automation bias and stress can lead to overreliance on GenAI tools and a significant lowering of analytical and critical thinking abilities. Multiple studies are showing that GenAI tools help generate quicker results and, in that sense, boost productivity. But the quality of the results is often lacking, and the people using the tools do not develop any cognitive abilities when relying on LLMs and typically will not recall the information they get from GenAI.
For these reasons, important parts of legal work, like the examples cited below, cannot and should not be combined with AI usage:
Legal Strategy Development
Legal strategy development includes the creation, fine-tuning and expansion of a specific plan on how to provide legal services to your client in the best possible way. Some examples include drafting a complex contract based on specific client requirements within a particular context or preparing arguments and interpretations of laws before and during court cases. Tasks connected to such strategic decisions are not good use cases for AI, as GenAI will often deliver generic responses and miss nuances. The specific and relevant content of policies, contractual agreements, EULAs and other similar documents must be drafted, extensively reviewed and produced mainly by lawyers. Otherwise, accountability becomes difficult to track, hallucinations may have significant negative consequences and lawyers' core skills can be severely impacted by automation bias and a decline in critical thinking.
Final Interpretation of Laws and Regulations
Final interpretation of laws and regulations, as well as precedents and contracts, is the main cognitive work of lawyers who have trained for years to do it effectively. Nuances will be missed by AI, so outsourcing this work to GenAI tools may cause significant issues for lawyers. Teams that work on GDPR, the AI Act, etc., must continue their work on those regulations and laws without the interpretative support of GenAI to ensure high quality and explainability of the results, as well as accountability.
Complex Negotiations
Complex negotiations need to be handled only by lawyers. Even using AI for assistance can be troublesome: its advice is usually very generalized even when given a specific prompt, it may not be suitable for the specific situation, and it can cause confusion and overreliance. LLMs do not understand nuance well, and in negotiations, nuance is key. Even when summarizing conversations, the tools often focus on unimportant things or overlook important ones. Furthermore, while they may be useful in transcribing conversations, that administrative task is only a small part of the whole negotiation process.
Drafting Complex Contract Clauses
If lawyers need to draft a complex clause during negotiations, it needs to be created and verified by humans. In some instances, small checks, including those for grammar, may be performed by AI, but AI tools do not handle complexity well. A final, detailed human check at the end of the drafting process is necessary.
Conclusion
In essence, the most important part of a legal professional’s work is interpreting text and understanding its value in a much broader context of business, regulations, laws and ethics. This main job is accompanied by a myriad of tasks that support it: administrative work, communication, drafting, reading and researching. Most of these additional processes can be aided to some extent by GenAI tools. The question every legal department and law firm needs to ask before turning to AI tools is whether they are worth the investment and are going to meaningfully impact the lawyer’s day. If the tools can prove that the risk of using them is warranted by a higher quality of work, then the investment makes sense. If, however, there are already existing tools that can help or if the GenAI systems will not have a positive effect on everyday work, then the decision to invest requires more scrutiny. The negative effects of GenAI, such as automation bias and cognitive debt, must also be considered when deciding whether or not to integrate GenAI into a lawyer’s workflow.
To learn more about how others are using AI to support their work and solve complex business challenges in unique ways, read our blog.
Petko Getov
Petko Getov is a technology lawyer at Progress with over 7 years of experience in multinational IT operations. He previously supported DXC Technology's business across the DACH region and its AI Office, analyzing AI contractual provisions and drafting governance frameworks for responsible GenAI deployment. In his role at Progress Software, Petko is focused on supporting vendor contracting globally, advising sales and product teams on customer transactions and product terms and actively participating in AI governance and implementation initiatives.
Petko holds dual master's degrees in Law (Humboldt University Berlin) and Public International Law (Leiden University), complemented by IAPP AI Governance Professional certification.
Petko's areas of expertise include: AI governance and ethics with IAPP certification, enterprise IT contract negotiation, EU AI Act and technology regulation, intellectual property in AI contexts and cross-border legal risk assessment with a specialization in the DACH region.
When not navigating the complexities of AI regulation and enterprise contracts, you'll find Petko on the tennis or basketball court, socializing with friends or traveling (especially to Spain).
Contact Petko on: LinkedIn, petko.getov@proton.me; petko.getov@progress.com