Artificial Intelligence (AI) has stormed into the legal realm, raising exciting possibilities and some hair-raising questions. Could AI models like ChatGPT and Bard revolutionise the drafting of contracts? Or might they lead to costly errors and legal nightmares? Here’s what we know so far:

New dawn for contract drafting?

Some forward-thinking corners of the legal world are abuzz with anticipation over the potential for AI models like ChatGPT to streamline the contract drafting process. By analysing vast amounts of data and legal documents, AI can in theory draft contracts quickly and efficiently. The automation of routine legal tasks not only saves time and money but also frees up legal professionals to focus on more complex and valuable issues.

In addition, the increasing prevalence of plugins (software components that integrate AI capabilities into an existing system or platform) opens up a seemingly unlimited list of use cases for tools like ChatGPT. Integrating ChatGPT with commonly used programs within Microsoft Office or Google Workspace is just the tip of the iceberg.
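
As a rough illustration of what such an integration might look like under the hood, the sketch below shows a plugin-style helper that sends a selected clause and a drafting instruction to an AI service. The endpoint, model name, and response format are hypothetical placeholders rather than any particular vendor's API.

```python
import requests

# Hypothetical endpoint and model name, used purely for illustration --
# not a real vendor's API.
AI_ENDPOINT = "https://api.example-ai-provider.com/v1/complete"
MODEL_NAME = "contract-drafting-model"


def suggest_redraft(clause_text: str, instruction: str, api_key: str) -> str:
    """Send a contract clause and a drafting instruction to an AI service
    and return the suggested redraft (the core call a plugin might make)."""
    response = requests.post(
        AI_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": MODEL_NAME,
            "prompt": f"Instruction: {instruction}\n\nClause:\n{clause_text}",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]


# A word-processor plugin might wire this to a button that acts on the
# currently selected text, e.g.:
# redraft = suggest_redraft(selected_clause, "Redraft in plain English", key)
```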

AI advantages: speed and precision

AI holds a distinct edge over human lawyers in terms of speed and data processing capabilities. The ability to rapidly analyse past contracts, precedent, and legislation should in theory allow AI to draft contracts that are both legally robust and tailored to specific requirements. Moreover, as it learns from its past work and data sources, AI can help identify potential risks, supporting a more comprehensive approach to drafting.

Weighing the pros and cons

As with any technological advancement – particularly those in the early stages of development – AI-driven contract drafting has its pros and cons.

The obvious pros are:

  • Efficiency: AI models can draft contracts in a fraction of the time taken by human lawyers.
  • Consistency: Automated contract drafting reduces the risk of human error and inconsistencies.
  • Accessibility: AI makes legal services more accessible to the public by reducing costs.

The often-overlooked cons are:

  • Limited understanding: AI models lack the human intuition and judgment needed to grasp the nuances of certain contractual clauses. A contractual provision may be perfect for one party because of the unique way in which it does business, but disastrous for another. Who will discern this and ensure the AI takes this nuance into account?
  • Ethical concerns: Overreliance on AI may lead to ethical dilemmas. Who is responsible for the code which underpins the AI’s output? What biases may humans have passed on to the AI? What assumptions and heuristics will make their way into the drafting (or any other output)?
  • Data security: Many commonly used AI platforms use the data input by their users to help further train and develop the underlying AI technology. Contracts, particularly those of a confidential nature, may contain information which is commercially sensitive or even personal. How can we prevent the information given to AI from getting into the wrong hands? Looking at this issue from another angle, can an AI platform accurately produce the documents we seek if we’re not able to provide it with the confidential or sensitive information it needs to contextualise our requests?
  • Liability: AI-generated errors could be costly. In that scenario, who is liable for a drafting error?

The road to flawless AI contracts

While AI models continue to improve, the prospect of flawless contract drafting remains a distant goal. AI’s capacity to evolve and learn from human input suggests that, with time, we may see AI-generated contracts that are virtually error-free. However, the ever-changing landscape of laws and regulations makes it unlikely that AI will completely replace human expertise.

In my view, the responsible use of today’s AI in drafting contracts must rely heavily on two things:

  • Expert prompting; and
  • Expert review.

What is prompting?

Prompting, in the context of AI, refers to providing an input, usually in the form of text, to an AI language model, which then generates a relevant response or output based on the given input. Essentially, the prompt serves as a starting point or trigger for the AI model to understand the user’s intent and respond accordingly.

Thorough and accurate prompting is essential when using AI to draft and interpret contracts, as it ensures that the AI model accurately understands the user’s intentions, requirements, and desired outcomes.

The lawyers of tomorrow will likely need to be trained in prompt engineering, because precise and clear prompts will guide AI in generating contract clauses that are legally robust, relevant, and tailored to specific needs. A well-crafted prompt reduces the risk of errors, ambiguities, or omissions, ultimately contributing to a more reliable and efficient contract drafting process.
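
To make this concrete, here is a minimal sketch of what a structured drafting prompt might look like, assuming a simple template approach; the field names and wording below are illustrative choices, not an established standard.

```python
def build_drafting_prompt(
    clause_type: str,
    governing_law: str,
    parties: str,
    key_requirements: list[str],
    style_notes: str,
) -> str:
    """Assemble a detailed drafting prompt so the model receives the context,
    constraints, and desired outcome explicitly rather than guessing at them."""
    requirements = "\n".join(f"- {item}" for item in key_requirements)
    return (
        f"You are assisting with drafting a {clause_type} clause.\n"
        f"Governing law: {governing_law}\n"
        f"Parties: {parties}\n"
        f"Key requirements:\n{requirements}\n"
        f"Style: {style_notes}\n"
        "Draft the clause, then list any assumptions you made and any points "
        "that require a qualified lawyer's review."
    )


# Example of a prompt a lawyer might assemble for a specific drafting task.
prompt = build_drafting_prompt(
    clause_type="limitation of liability",
    governing_law="England and Wales",
    parties="a software supplier and an enterprise customer",
    key_requirements=[
        "cap direct losses at 100% of annual fees",
        "exclude indirect and consequential loss",
        "carve out liability that cannot be excluded by law",
    ],
    style_notes="plain English, defined terms capitalised",
)
print(prompt)
```

Front-loading the facts the model cannot infer on its own (the parties, the governing law, the commercial constraints) addresses exactly the gap where vague prompts tend to produce generic or ambiguous clauses.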

What is an expert likely to correct in AI-drafted contracts?

The importance of expert human input in the creation of contracts, as opposed to relying heavily on AI, lies in several key factors:

  • Nuance and intuition: Human legal experts possess the ability to grasp the subtleties and nuances of complex legal situations. They can interpret the intentions of the parties involved and use their judgment to draft clauses that accurately reflect those intentions. AI models, on the other hand, may lack the necessary intuition to understand these complexities fully.
  • Adaptability: Laws and regulations are constantly evolving. Human legal professionals have the capability to stay updated with these changes and apply them in drafting contracts. AI models require continuous updating and training, which may not always keep pace with the rapidly changing legal landscape.
  • Ethical considerations: Contracts often involve ethical and moral aspects that require human judgment. Legal professionals can assess the fairness and equity of contractual terms, ensuring that the contract is not only legally valid but also ethically sound. This is especially important where the contract is being negotiated. AI models may not (yet) have the capacity to make these evaluations effectively.
  • Negotiation and communication: Contract drafting is not a one-sided process. It often involves negotiation and communication between parties. Human legal experts can engage in these discussions, understand the concerns of each party, and find mutually beneficial solutions. AI systems may not be able to navigate interpersonal dynamics and build consensus as effectively as humans.
  • Tailoring contracts to specific needs: While AI models can analyse vast amounts of data and identify patterns, they may struggle to account for unique or unconventional circumstances. Human legal experts can tailor contracts to suit the specific needs and goals of the parties involved, ensuring a more customised approach.

Striking the right balance

AI models like ChatGPT present a promising future for contract drafting, with benefits such as increased efficiency, reduced errors, and greater accessibility. However, at least for now, it is essential to strike a balance between embracing these innovations and maintaining a healthy dose of human judgment. As the legal landscape evolves, combining AI-driven technology with human expertise will be crucial to navigating the complex world of contract drafting.