The explosion of generative AI technology across the legal space has bewildered contract management teams. Many are concerned about the security risks of integrating the technology into existing contract management programs, while others question whether current generative AI platforms are trustworthy. But even those who are hesitant to use the technology fear being left behind if they don't start integrating generative AI applications into current workflows and day-to-day operations.
“The reality is, in the context of contract management systems, teams that fail to use generative AI in a meaningful way will be slower than their counterparts, incur more costs, and add less value to the overall organization,” says Pramata CEO Praful Saklani. “By not capitalizing on the numerous benefits that generative AI can provide, contract management teams are putting their entire business at risk of falling behind their biggest competitors.”
According to Praful, generative AI is truly a game-changer for the legal industry, particularly for contract management professionals. It is the ultimate self-service tool for legal departments, enabling in-house counsel teams to easily share business-critical contract data with finance, sales, and procurement teams.
With so many legal professionals questioning how best to use generative AI within their contract management programs, we had Praful answer some of the most pressing questions Pramata has received from our community of general counsels and in-house legal teams.
Generative AI Applications within Contract Management Systems: A Conversation with Pramata CEO Praful Saklani
Why is clean contract data crucial to generative AI applications and the value they can deliver?
If a company wants to get real value from generative AI, it’s important to understand clean data’s role in generative AI outputs. The way to look at it is in terms of the questions—or prompts—used to retrieve a response or output. Generative AI is at its best when asked precise questions or when asked to create content on precise topics.
For example, if you ask ChatGPT what a good standard is for a limitation of liability clause, it will likely base its response on generic data pulled from the large language model that powers it, which may be good advice or may not. But if you were to input three of your own limitation of liability clauses into the generative AI platform, identifying which one you consider high risk, which medium risk, and which low risk, you could then ask the generative AI to analyze a new contract using the risk criteria you entered. It would deliver a much stronger, more accurate response, telling you whether the limitation of liability in the new contract carries a high, medium, or low risk.
If you instruct the generative AI how to assess contract clauses, it can give you a much more useful answer. Getting real value out of generative AI therefore requires clean data that can serve as a playbook derived from your existing contract data.
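As an illustration, the few-shot approach described above can be sketched as a prompt template that pairs your own labeled clauses with the clause you want assessed. The clause texts and risk labels below are hypothetical placeholders, not real contract language:

```python
# Hypothetical sketch: build a few-shot prompt that teaches a generative AI
# model your own risk criteria for limitation-of-liability clauses.
# All clause texts and labels here are illustrative placeholders.

LABELED_EXAMPLES = [
    ("Liability is capped at fees paid in the prior 12 months.", "low"),
    ("Liability is capped at 2x total contract value.", "medium"),
    ("No cap on liability; consequential damages are recoverable.", "high"),
]

def build_risk_prompt(new_clause: str) -> str:
    """Assemble a prompt that pairs each example clause with its risk label,
    then asks the model to classify a new clause by the same criteria."""
    lines = [
        "Classify the limitation of liability clause as low, medium, or high risk.",
        "Use these examples of our own risk criteria:",
        "",
    ]
    for clause, label in LABELED_EXAMPLES:
        lines.append(f'Clause: "{clause}"\nRisk: {label}\n')
    lines.append(f'Clause: "{new_clause}"\nRisk:')
    return "\n".join(lines)

prompt = build_risk_prompt("Liability is limited to direct damages only.")
```

The point is that the model's answer is grounded in your own labeled data rather than whatever generic standard the underlying model happens to surface.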
Clean data is crucial because it’s not the actual words within the contract, but the logic underneath the paragraphs that you really care about. Ultimately, the more precision you can use when working within a generative AI platform, the more value you will get from the technology. That’s what we’re talking about when we talk about clean data and generative AI’s ability to provide useful and actionable content.
What steps should a contract management team or in-house counsel take to ensure they build a clean data repository?
Any contract management program, especially one that wants to leverage generative AI, needs to start with a clean set of existing contracts. The information contained within these existing contracts will serve as the foundational data that powers generative AI outputs.
As we've already made clear when discussing clean data, the more precise you can be with the information you put into the generative AI system, whether by giving it a historical view of what you have done or by showing it how you do things, the more useful it will be. What does a "clean" foundational data set mean? To start, there should not be any contract drafts or irrelevant emails from the negotiation process entered into the system. Your contract data repository should exclude any extraneous materials that diminish the integrity of your contract data.
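In practice, that cleanup step can be as simple as filtering the repository down to executed agreements before anything feeds the AI. The file names and the `status` convention below are illustrative assumptions, not a real system's schema:

```python
# Hypothetical sketch: exclude drafts and negotiation emails from the
# contract repository so only fully executed agreements remain.
# Document names and the "status" field are illustrative assumptions.

documents = [
    {"name": "msa_acme_signed.pdf", "status": "executed"},
    {"name": "msa_acme_draft_v3.docx", "status": "draft"},
    {"name": "re_redline_comments.eml", "status": "email"},
]

def clean_repository(docs):
    """Keep only executed contracts; drop drafts and extraneous emails."""
    return [d for d in docs if d["status"] == "executed"]

cleaned = clean_repository(documents)
```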
Another key step to building a repository of clean contract data is the creation of a playbook for each type of contract. A playbook can define how you negotiate contract terms, which types of information you want the generative AI to analyze, and how you want it analyzed; for example, what your team considers a low-, medium-, or high-risk liability clause.
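One way to picture a playbook is as a small, structured mapping from risk tiers to the team's own criteria, which can then be rendered as plain-English instructions for the AI. The thresholds below are invented for illustration only:

```python
# Hypothetical sketch: a playbook encoding how a team grades
# limitation-of-liability caps. The thresholds are illustrative only.

LIABILITY_PLAYBOOK = {
    "low": "cap at or below 12 months of fees",
    "medium": "cap between 12 months of fees and total contract value",
    "high": "uncapped, or includes consequential damages",
}

def playbook_instructions(playbook: dict) -> str:
    """Render the playbook as plain-English instructions for a prompt."""
    return "\n".join(f"- {risk} risk: {rule}" for risk, rule in playbook.items())

text = playbook_instructions(LIABILITY_PLAYBOOK)
```

Because the criteria live in one place, updating the playbook updates every prompt that uses it.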
If you have a clean foundational data set in place, along with playbooks that can train the generative AI technology to align with your contract management programs, you will be surprised at just how invaluable the technology becomes. Not only will it be able to deliver high quality outputs, but it will apply to various use cases that accelerate your contract management programs and democratize your contract data—making it accessible and actionable across the organization.
What would you say to General Counsels and legal teams that are concerned about the security risk of generative AI platforms?
If you're using a popular generative AI offering, such as OpenAI's ChatGPT or Microsoft's Azure OpenAI Service, you need to ensure that none of the information you put into the system can be used to train their models. Depending on which generative AI platform you're using, there are very straightforward tactics you can implement to protect confidential information or vulnerable data.
OpenAI's ChatGPT allows users to disable their chat history so that any information used to generate a response will not appear in the user's history, nor will it be used to train OpenAI's models. Other platforms provide similar security measures, including the option to destroy chat histories at the end of a session so that none of the information entered during that session is saved.
All of the major players in the generative AI space understand just how critical security is for enterprise organizations and are bending over backwards to ensure users can eliminate any data retention or persistence. More importantly, if you’re working with a vendor or partner that has launched a generative AI application, make sure you are able to configure the application so that you are operating in your own sandbox that keeps your data separate from the large language model underpinning the technology.
When speaking with General Counsels and other legal professionals, what are their biggest concerns and questions about the technology?
The three key themes that keep showing up in my conversations are: first, security concerns, which we've addressed; second, the trustworthiness of the technology, which is resolved by using it with your own clean data set; and third, how much work the technology will eliminate.
This last concern can be a positive or negative depending on the person you’re talking to. The negative view is that generative AI will altogether remove the need for attorneys or legal operations teams. Industry analysts are still questioning whether the technology will completely replace legal professionals—which, in my opinion, is a resounding no. But, as the technology stabilizes over time, it will reduce the amount of busy work legal professionals perform on a day-to-day basis by as much as 80% to 90%.
Implementing generative AI will be akin to having unlimited time from a first-year associate at zero cost. The work will be completed in minutes instead of hours or even days. Sure, you will have to review the work and edit the finished product, but generative AI can get you 80% of the way to where you want to go.
It is reminiscent of the early days of NASA, when all the computational work was done by hand. Engineers spent much of their time computing data using math tables because the technology to do it for them did not exist. The dawn of the computer age didn't eliminate the need for engineers; it did just the opposite, empowering engineers to do things that added even more value. That is what generative AI will do for legal teams.
Do you believe generative AI will disrupt the CLM industry?
Traditional contract lifecycle management platforms have such a low level of customer satisfaction because they are often complicated to implement and difficult to maintain. Designing new templates and creating new workflows and approval processes can be particularly challenging, especially if you're part of an organization that does a lot of contract negotiation or uses a lot of other-party paper.
Also, contract management workflows and approval processes tend to be very dynamic. Companies undergo reorgs all the time, which in turn leads to new workflows and approval processes. If your team has to re-engineer and redesign your contract management program rules and approval processes every time a reorg happens, that can be a very painful process within an overly complicated CLM. It’s hard to design; it’s hard to implement; and, it’s really, really hard to maintain and keep up to date.
With generative AI, templates, playbooks, and approval logic can be written in plain English, and when things change the logic can simply be re-stated in plain English. Generative AI can make all of this 90% easier. That is why the technology is such a major game-changer for our industry.
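To make the contrast concrete: if approval logic lives as plain-English statements that a generative AI (or a human) interprets, then a reorg becomes a one-line edit rather than a workflow re-engineering project. The rules and role names below are invented for illustration:

```python
# Hypothetical sketch: approval logic kept as plain-English statements.
# After a reorg, the rules are simply re-stated; no workflow rebuild needed.
# All rules and role names are illustrative.

APPROVAL_RULES = [
    "Deals over $500k require VP of Sales approval.",
    "Any change to the limitation of liability clause requires Legal review.",
]

def restate_after_reorg(rules, old_role, new_role):
    """Re-state the rules in plain English after an org change."""
    return [r.replace(old_role, new_role) for r in rules]

updated = restate_after_reorg(APPROVAL_RULES, "VP of Sales", "Chief Revenue Officer")
```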
How does Pramata’s integration of generative AI set it apart from other contract management platforms?
Pramata's philosophy has always been centered on simplicity. We take a radically simple approach to contract management, meaning our solution enables contract management teams to simplify workflow processes, whether that means seamlessly sharing critical contract data with other business units like sales, finance, or procurement, or generating new contracts. Our platform simplifies every aspect of contract management. Add to this a repository of clean contract data, an integral component of Pramata's solution, and you can leverage generative AI to its maximum capability within the contract management space.
It's the reason Pramata's generative AI capabilities are so impactful. Contract management teams can quickly analyze existing contracts to create clean playbooks for mitigating risk. They can design commonsense playbooks and use them to build new templates or create new contracts without complex engineering or the help of an outside consultant. The platform lets you create simple approval rules in plain English or build new templates based on contract terms previously agreed upon with customers.
With Pramata’s generative AI functionality, you can leverage your existing contract repository to create seamless workflows and accelerated approval processes. Contract creation and template building becomes more dynamic than ever with an intuitive user interface that democratizes contract management programs across the organization.
Pramata is the fastest way for contract management teams to leverage generative AI because our contract management platform is the fastest way to organize your complex contract data.
Watch Pramata’s Generative AI Assist in action here.