AI note-taking is the topic du jour. The first thing to know is that you need to experience it for yourself to see why it’s so attractive. (I’ll post a blog soon summarizing how AI note-taking works in practice.)
It’s truly amazing. No more “spacing out” while attending a long, drawn-out panel. No more daydreaming while “listening” to a significant other. You can now rely on your AI companion to step in and make you look like a hero.
But there are risks with AI note-taking, and you need to understand how AI works in order to manage your own use of these tools – and be able to explain them to your senior managers and directors. You don’t want them to get into hot water because they didn’t understand the technology.
So, as the voice of reason, you should be aware of a few things:
- Be selective. The meeting host, or an attendee with permission, should decide for each meeting whether AI note-taking is appropriate. Don’t default to “always on” – you might end up with a written record that you don’t want. Think about the level of confidentiality that’s required, whether the topic or type of meeting is one that could be subject to litigation discovery in the future, and how your company’s document retention policy will apply if the meeting is summarized in writing.
- Bake AI into your board meeting compliance warnings. As AI note-taking should be banned from board – and board committee – meetings (as you’ll learn below), you’ll need to periodically remind directors, senior managers, independent auditors and anyone else attending these meetings that they can’t take notes using AI (or otherwise, of course) from their phones or laptops.
This warning goes neatly with your insider trading, confidentiality and Section 16 compliance reminders.
- AI compliance and etiquette when taking notes. There will be times when AI note-taking can be useful outside of the board meeting context, and this is where guidelines and etiquette play a role. Whether it’s an internal or external meeting, you should ask those in the meeting for permission before using AI to take notes. Especially if the meeting is contentious in any way, various call recording laws may give a disgruntled participant a way to raise issues.
There are exceptions to this, such as taking notes at a conference you’re attending that isn’t operating under Chatham House rules. It’s a bit of a fine line between when asking permission is appropriate and when it’s unnecessary (for example, a small Zoom group meeting or an interview with a candidate to work in your department where sensitive material isn’t being discussed). Often, you can simply announce that AI note-taking is being used or that the meeting is being recorded. Let your intuition guide you here.
Note that if people you’re talking to know that AI is listening and transcribing, they may self-censor or avoid candid discussions. So AI note-taking can have a chilling effect that human note-taking likely won’t induce.
And, if you’re using a third-party AI tool – not an enterprise version that is “closed” within your company – you greatly increase the risk that whatever is said during the meeting ends up outside your company’s control, possibly even in the public domain. Even with a “closed” AI system, someone within your company will need to vet the AI vendor thoroughly to ensure that its product is aligned with your company’s compliance, retention and data governance practices.
You should also know that many companies are updating their policies to ensure AI records are addressed and that people are trained on the best use cases as the technology advances. Some are even implementing dedicated AI note-taking policies. If your company isn’t already working on this, you may need to be the one to raise it and consider whether it’s needed.
Here are the principal reasons why AI should not be used to take notes during board meetings:
1. Confidentiality and data security: Board meetings involve highly sensitive information. If the AI tool is cloud-based or relies on third-party providers, there’s a risk of data breaches, unauthorized access or compliance issues under cybersecurity regulations and company policies. It’s important to use enterprise versions of tools rather than consumer-facing versions, which tend to have fewer protections.
2. Accuracy and context sensitivity: AI might not fully understand nuance, tone or context, particularly in complex, high-stakes board discussions. It could misinterpret sarcasm, strategic ambiguity or off-the-cuff remarks and record them literally, creating a misleading or overly formalized version of the conversation.
3. Attorney-client privilege risks: Parts of board discussions may be protected by attorney-client privilege. It’s unclear at this point whether privilege is waived if an AI tool captures and stores these conversations without appropriate safeguards. Don’t take the risk of turning a protected conversation into discoverable evidence.
4. Lack of discretion in editing: Human note-takers know what not to include. AI might record discussions verbatim or in too much detail, creating a more extensive record than intended. This can backfire if board minutes are ever scrutinized in litigation or regulatory inquiries.
5. Regulatory and litigation exposure: Anything recorded could become discoverable. Overly detailed or inaccurate AI notes could create risk in securities litigation, shareholder derivative suits or SEC enforcement inquiries. Think of it as giving your boardroom its own court stenographer – with no filter.
Authored by Broc Romanek