Samuel Seaman of HunterMaclean: What Not to Feed AI: Contracts and Confidentiality

Samuel Seaman

Friday, September 26th, 2025

Each of us has been exposed to countless news articles and posts on the subject of artificial intelligence (AI). All of them make one thing clear: AI tools are quickly becoming a major part of modern business operations. This article is not intended to weigh in on AI from any particular angle, but to offer some practical, lawyerly guidance.

While these tools offer substantial benefits, they are not without limitations, particularly concerning confidentiality, data security, and regulatory compliance.

Uploading the wrong kind of data to AI platforms can expose your business to a multitude of legal, financial, and contractual risks. Whether you're a business owner or employee, it’s critical to know what not to share when using AI systems.

________________________________________

Third Party Contracts

Much of the data your business handles is protected by contracts such as nondisclosure agreements (NDAs), master service agreements, or data use agreements, to name a few. Most businesspeople are well aware that certain information is “confidential,” for example, business customer information, other business secrets, individuals’ Social Security numbers, and protected health information. However, an often-overlooked category of confidential information is the terms of the contract itself.

Many confidentiality provisions in contracts and NDAs include a phrase along the lines of “included in the definition of ‘Confidential Information’ are the terms and conditions of this Agreement.” This means that disclosure of the terms of the contract or NDA itself is restricted to the same degree as the disclosing party’s other confidential information.

Contracts may also contain other information that is confidential or that the disclosing party prohibits from being shared, for example, intellectual property subject to licensing terms or other protections.

Uploading contracts to AI tools, even for routine tasks, can breach those contracts. Even partial use of such content could be a violation. 

What’s at risk:

  • Claims for breach of contract

  • Termination of business relationships

  • Regulatory penalties or intellectual property disputes

Tip: Before using AI to analyze or work with sensitive documents, check with legal or compliance teams first.

This is not to say that you have to call a lawyer every time you want help understanding a contract, or that you cannot leverage the value of AI. Rather, it is a suggestion that you be as cautious using AI as you would be with more established technology. For example, you would not post your biggest vendor’s executed master service agreement on LinkedIn. Different AI solutions offer different degrees of security. Companies that permit or encourage their employees to use AI should vet a company-approved solution with appropriate security measures.

________________________________________

Final Thought: Think Before You Paste

AI tools are not private vaults. They are external systems governed by their own terms of use and data handling practices, which may not align with your organization’s internal security standards. If you wouldn’t email the information to an external vendor without approval, don’t upload it to an AI tool without approval.

Bottom line: Misuse of AI can lead to breach of contract, data leaks, and disputes with clients and vendors. Make sure your business and your employees are educated on the risks and understand how to avoid them.

________________________________________

Need help managing the legal risks of AI in your business?

Contact Sam Seaman at 912.236.0261 to discuss how AI, data use, and confidentiality intersect with your legal obligations. You can also visit our Cybersecurity & Data Privacy practice page to learn more.