7 AI Meeting Notetakers That Don't Train on Your Data (And Why It Matters)

A guide for privacy-conscious organizations evaluating AI transcription tools.

By The Meetingnotes Team | 14 min read | January 28, 2026 | Tools

AI meeting assistants have become indispensable for modern teams. They transcribe conversations, generate summaries, extract action items, and save hours of manual work. But this convenience comes with a critical question that many organizations overlook: What happens to your meeting data after it's processed?

For many AI tools, the answer is troubling. Your confidential business discussions, strategic plans, customer information, and sensitive HR conversations may be used to train the very AI models that competitors and other companies use. A class action lawsuit filed in August 2025 against Otter.ai brought this issue into sharp focus, alleging the company secretly recorded and used private conversations to train its AI models without proper consent.

Whether you're in healthcare, legal services, finance, or any industry handling sensitive information, knowing which AI notetakers protect your data is no longer optional. Here's what you need to know.

Why data training policies matter more than ever

When an AI tool trains on your meeting data, several risks emerge:

Data leakage: Large language models can sometimes reproduce or reveal patterns from their training data. If your confidential discussions are part of that training set, there's a risk that sensitive information could surface in outputs generated for other users.

Competitive exposure: Your strategic discussions, product roadmaps, and business intelligence could theoretically inform AI responses to your competitors using the same service.

Compliance violations: For organizations subject to HIPAA, GDPR, or other regulatory frameworks, having meeting data used for AI training may violate data protection requirements.

Loss of control: Once your data becomes part of a training dataset, you typically cannot request its removal or know how it's being used.

The Otter.ai lawsuit: A wake-up call for enterprises

In August 2025, a federal class action lawsuit was filed against Otter.ai in the U.S. District Court for the Northern District of California. The case, Brewer v. Otter.ai, alleges that Otter secretly recorded private conversations and repurposed them to train its machine learning models (Best Law Firms).

According to NPR's coverage of the lawsuit, the plaintiff, Justin Brewer, claims his privacy was "severely invaded" after discovering Otter had secretly recorded a confidential conversation (NPR). The suit alleges that Otter's transcription tool joins Zoom, Google Meet, and Microsoft Teams meetings as a participant and transmits conversations to Otter in real time, often without obtaining affirmative consent from all attendees.

The lawsuit points to several concerning practices: Otter allegedly sought permission only from meeting hosts (and sometimes not even them), while other participants could not disable the tool. According to the complaint, Otter allegedly retains conversational data indefinitely and leverages it to refine its speech recognition technology without participant permission. (Fisher Phillips)

The implications extend beyond Otter. This case has prompted enterprises across industries to scrutinize the data practices of all their AI meeting tools.

Universities and enterprises tighten controls

The Otter lawsuit is part of a broader trend. In February 2025, Harvard University issued new guidance stating that AI meeting assistants should not be used in Harvard meetings, with the exception of approved tools with contractual protections (Harvard University Information Technology). The university cited potential legal and data security risks, warning that meeting data could be exposed to third parties or used by companies when training future AI models.

7 AI meeting notetakers that don't train on your data

Not all AI notetakers use your data for training. The following tools have explicit policies stating they do not use customer meeting content to train AI models. However, policies can change, so always verify current terms before making a decision.

1. Fellow

Data training policy: Fellow never trains its AI models on your meeting data; your conversations remain private and are never used to improve Fellow's AI. Fellow has also put contractual provisions in place to ensure its LLM providers do not retain data for training or any other purpose.

Security certifications: SOC 2 Type II, HIPAA compliant, GDPR compliant

Notable features: Supports recording with or without a visible bot, granular recording policies, transcript redaction tools, regional data residency options, integrations with 50+ tools across CRM, automation, and project management categories

The New York Times Wirecutter named Fellow the best service for transcribing and summarizing meetings in 2025.

Best for: Enterprise teams requiring strict data governance, healthcare and legal organizations, companies with compliance-heavy requirements and high-accuracy needs

Website: fellow.ai

2. Fireflies.ai

Data training policy: Fireflies states that meeting content, including audio, video, transcripts, and summaries, is never used to train any AI models. They enforce a Zero Data Retention policy, meaning no storing, no accessing, and no training on data by third-party vendors after processing.

Security certifications: SOC 2 Type II, GDPR compliant, HIPAA compliant (with BAA for enterprise)

Notable features: Extensive integrations (including CRMs), support for 69+ languages, private storage options for enterprise customers, Rules Engine for automated data governance.

Best for: Teams needing extensive CRM and workflow integrations, organizations requiring multi-language support.

Website: fireflies.ai

3. Fathom

Data training policy: Fathom uses de-identified customer data to improve the accuracy of its proprietary AI models, but individual users can opt out in Settings, and organizations on the Team Edition can opt out all users on their account.

Security certifications: SOC 2 Type II, HIPAA compliant, GDPR compliant

Notable features: Generous free tier with unlimited recording and transcription, strong CRM integrations with Salesforce and HubSpot, Zoom-focused experience.

Best for: Small teams and startups seeking cost-effective solutions, sales teams needing CRM sync, Zoom-centric organizations.

Website: fathom.video

4. tl;dv

Data training policy: tl;dv explicitly states that your recordings and transcripts are yours, not theirs, and they will never use them to train AI. They partner with Anthropic to deliver secure generative AI, anonymizing all metadata and processing meetings in randomized, small chunks to prevent unauthorized access.

Security certifications: SOC 2 certified, GDPR compliant, data stored in ISO 27001-certified data centers with AES-256 encryption

Notable features: Video-focused summaries with timestamped highlights, 6,000+ integrations, sales coaching playbooks (BANT, MEDDIC), EU-hosted AI options available.

Best for: Customer-facing teams, sales organizations needing coaching tools, European companies with data residency requirements.

Website: tldv.io

5. Read AI

Data training policy: Read AI states they never use meeting content for training unless users explicitly, knowingly opt in.

Security certifications: SOC 2 Type II, GDPR compliant, HIPAA compliant

Notable features: Meeting optimization with engagement scores and sentiment tracking, speaker coaching insights, visible and transparent meeting participation with easy participant controls.

Best for: Teams focused on meeting effectiveness and coaching, organizations wanting real-time engagement analytics.

Website: read.ai

6. Krisp

Data training policy: Krisp processes audio locally on the device for its core noise cancellation features. For meeting summaries, Krisp relies on Microsoft Azure services, and Microsoft does not use customer data for internal training. Krisp also offers a "hard-delete" option to permanently remove transcripts upon request.

Security certifications: SOC 2 Type II, PCI DSS certified, GDPR compliant, HIPAA compliant

Notable features: Industry-leading noise cancellation, bot-free operation (works at system level), works with any meeting platform or audio application, automatic audio deletion after processing.

Best for: Privacy-conscious individuals, professionals in noisy environments, teams that want transcription without visible bots joining meetings.

Website: krisp.ai

7. Granola

Data training policy: Granola trains on anonymized data by default, but any user can opt out in Settings. Enterprise workspaces have training off by default and org-wide controls. Notably, Granola is bot-free and transcript-only, meaning it never creates audio or video recordings.

Security features: Local audio processing (audio never leaves your device for transcription), no recording storage, enterprise controls for organization-wide settings

Notable features: Completely bot-free (no visible participant), desktop-based live transcription, customizable note templates, lightweight and unobtrusive.

Best for: Individual professionals who want meeting notes without any visible AI presence, users who prioritize a minimal footprint.

Website: granola.so

What to ask before choosing an AI notetaker

Before implementing any AI meeting tool, ask these questions:

1. Is my meeting content used to train AI models? Look for explicit statements, not just implications. "We use data to improve our services" often means training.

2. What about third-party AI providers? Many tools use OpenAI, Anthropic, or Google's APIs. Confirm whether those providers have access to your data and what contractual protections exist.

3. Can I opt out of data training? Some tools offer opt-out options. Verify whether this is available and whether it's applied by default or requires action.

4. How long is my data retained? Shorter retention periods reduce exposure risk. Look for tools that allow you to set custom retention policies.

5. What security certifications does the tool have? SOC 2 Type II is the baseline for enterprise. HIPAA compliance is essential for healthcare. GDPR compliance matters for any organization with EU connections.

6. How does the tool obtain consent from meeting participants? Tools should notify all participants, not just the host. Some jurisdictions require all-party consent for recording.

The bottom line

The Otter.ai lawsuit has fundamentally changed how organizations should evaluate AI meeting tools. Data training policies are no longer a nice-to-have; they're a critical factor in vendor selection.

The tools listed here have taken public stances against using customer data for AI training. But policies can evolve, and the burden is on organizations to verify current practices before implementation. Read privacy policies carefully, ask vendors direct questions, and consider working with legal counsel to understand your exposure.

Your meetings contain some of your organization's most valuable and sensitive information. Choose tools that treat that information with the respect it deserves.

Never take meeting notes again

Record, transcribe and summarize your meetings with Fellow.

Get started with Fellow today. Start a free trial.
