In September 2024, a venture capital firm used Otter.ai to record a Zoom call with a founder. After the meeting, Otter accidentally emailed the founder a transcript that included hours of private investor conversations — discussions that happened after he had left the call. The deal was cancelled.

That incident was a preview of what's now unfolding across the AI meeting tool industry. In 2025 and early 2026, class action lawsuits have been filed against Otter.ai, Fireflies.ai, and Microsoft Teams. Universities are banning these tools. Companies are blocking them at the firewall level.

If you're using cloud-based AI meeting recorders — or letting others use them in meetings with you — here's what you need to know.

TL;DR

  • Otter.ai — facing class action for recording without consent + using data to train AI
  • Fireflies.ai — sued for collecting voiceprints without BIPA-required consent
  • Microsoft Teams — February 2026 lawsuit alleges illegal voiceprint collection
  • Stanford, Oxford, Tufts — have banned third-party AI meeting bots
  • Local-first tools — avoid these issues entirely by keeping data on-device

The lawsuits

Three major AI meeting tools are now defending themselves in federal court. The allegations are serious: recording without consent, collecting biometric data illegally, and using private conversations to train AI models.

Brewer v. Otter.ai (August 2025)

Filed in California federal court, this class action lawsuit alleges Otter.ai "deceptively and surreptitiously" recorded private conversations without obtaining proper consent from all participants.

The core issue: when someone with an Otter account joins a meeting, the bot can auto-join and record without asking non-account holders for permission. The lawsuit also alleges Otter used recorded conversations to train its AI models.

Claims: federal Electronic Communications Privacy Act (ECPA), Computer Fraud and Abuse Act (CFAA), California Invasion of Privacy Act (CIPA), and California's Unfair Competition Law

Cruz v. Fireflies.AI (December 2025)

An Illinois resident filed this class action alleging Fireflies.ai illegally collects biometric voiceprints — unique vocal characteristics used to identify speakers.

The complaint states Fireflies collects this data from "people who never created a Fireflies account, never agreed to its terms of service, and never gave written consent." The company also allegedly has no published policy on how long it retains biometric data.

Claims: Illinois Biometric Information Privacy Act (BIPA) — up to $5,000 per violation

Microsoft Teams Class Action (February 2026)

Five Illinois residents sued Microsoft in February 2026, alleging Teams' real-time transcription feature creates "voiceprints" by analyzing pitch, tone, and timbre to identify speakers.

The lawsuit argues Microsoft should have "categorically informed users how the data would be used and how long it would be stored" — which it allegedly did not do.

Claims: Illinois BIPA — seeking $1,000-$5,000 per violation

The legal minefield

These lawsuits highlight a fundamental problem with cloud-based meeting recorders: they're collecting sensitive data in ways that may violate multiple laws.

Two-party consent states

Thirteen US states require all parties to consent before a conversation can be recorded:

California, Connecticut, Delaware, Florida, Illinois, Maryland, Massachusetts, Michigan, Montana, Nevada, New Hampshire, Pennsylvania, Washington

In California, the California Invasion of Privacy Act (CIPA) allows victims to sue for up to $5,000 per violation. When an AI bot auto-joins a meeting without explicit consent from every participant, it may be violating this law.
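The "strictest jurisdiction wins" logic is simple enough to express in code. Below is a minimal sketch; `needs_all_party_consent` is a hypothetical helper for illustration, not legal advice, and the state list is the one from this article.

```python
# Hypothetical helper: decide whether a recording needs consent from
# every participant, based on US two-party (all-party) consent states.
# Sketch only -- not legal advice.

ALL_PARTY_CONSENT_STATES = {
    "CA", "CT", "DE", "FL", "IL", "MD", "MA",
    "MI", "MT", "NV", "NH", "PA", "WA",
}

def needs_all_party_consent(participant_states):
    """Return True if any participant sits in an all-party consent state.

    The strictest jurisdiction on the call governs: one California
    participant means everyone on the call must consent.
    """
    return any(state in ALL_PARTY_CONSENT_STATES
               for state in participant_states)

# A New York host recording a California founder still needs consent
# from every participant, not just the host.
print(needs_all_party_consent(["NY", "CA"]))  # True
print(needs_all_party_consent(["NY", "TX"]))  # False
```

This is exactly the check an auto-joining bot skips when it asks only the account holder for permission.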

Biometric data laws

Illinois BIPA is the strictest biometric privacy law in the US. Before collecting a biometric identifier, it requires:

  • written, informed consent from each person
  • disclosure of what is collected, why, and for how long it will be stored
  • a publicly available retention schedule and destruction policy

Voice is biometric data. When AI tools analyze your voice to identify who's speaking (speaker diarization), they're creating a voiceprint — which BIPA regulates. Simply clicking "I agree" to general terms of service doesn't satisfy BIPA's requirements.
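To see why speaker identification implies a stored biometric template, here is a toy sketch: it reduces audio to a tiny numeric "voiceprint" (dominant frequency plus energy) and matches new audio against enrolled templates. Real diarization systems use learned embeddings such as x-vectors, and none of these function names come from any vendor's actual pipeline, but the structure is the same: a reusable template derived from a person's voice.

```python
import math

def crude_voiceprint(samples, rate=16000):
    """Toy 'voiceprint': dominant frequency estimated from upward
    zero crossings, plus RMS energy. A stand-in for the learned
    embeddings real diarization models produce."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    dominant_hz = crossings * rate / len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return (dominant_hz, rms)

def tone(freq, rate=16000, secs=1.0):
    # Pure sine tones stand in for real voices in this sketch.
    return [math.sin(2 * math.pi * freq * t / rate)
            for t in range(int(rate * secs))]

# 'Enrollment': templates are computed once and kept around --
# this persistence is what makes the data biometric under BIPA.
enrolled = {
    "alice": crude_voiceprint(tone(220)),
    "bob": crude_voiceprint(tone(440)),
}

def identify(samples):
    probe = crude_voiceprint(samples)
    return min(enrolled,
               key=lambda name: abs(enrolled[name][0] - probe[0]))

print(identify(tone(225)))  # closest to alice's stored template
```

The legally significant step is the enrollment dictionary: once a template exists that can pick a person out of future audio, a biometric identifier has been collected, whether or not anyone calls it a voiceprint.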

Data training concerns

The Otter.ai lawsuit specifically alleges that meeting transcripts were used to train AI models. Even if you consented to transcription, did you consent to your private conversations becoming training data?

As one security researcher put it: "The primary risk is your sensitive business or customer data being leaked or used to train large language models."

Universities and companies are banning AI bots

Institutions aren't waiting for the lawsuits to play out. They're blocking these tools now.

  • Stanford: blocked from Zoom (Fireflies, OtterPilot, Sembly, Grain, Avoma, Fathom)
  • Oxford: blocked, plus SSO disabled (Read AI, Fireflies AI, Sembly AI)
  • Tufts: blocked from Zoom and Teams (all unapproved AI bots; only Zoom AI Companion allowed)
  • Chapman University: banned Read AI

Stanford's IT department explicitly warns that third-party bots may "scrape calendars, unknowingly transcribe meetings, save meetings in unknown places, and join meetings even when users are not present."

Corporate IT departments are taking similar action. As one Gartner peer community discussion noted: "Most companies' policies are to not have these capabilities enabled until they understand the data risks posed."

The compliance problem

For certain industries, cloud-based meeting transcription is a non-starter:

  • Healthcare: patient details discussed in a meeting can be PHI under HIPAA, and routing them through a third-party cloud requires a business associate agreement
  • Legal: uploading privileged client conversations to a vendor risks waiving attorney-client privilege
  • Finance: firms face strict recordkeeping and data-handling rules for client communications
  • EU-facing businesses: under GDPR, voice data that can identify a person is personal data, with its own consent and transfer requirements

The alternative: local-first recording

The core problem with cloud-based AI meeting tools is architectural: your audio leaves your device, goes to someone else's servers, and you lose control over what happens to it.

Local-first tools solve this by keeping everything on your computer:

  • audio is captured and transcribed on-device, so nothing is uploaded
  • no bot joins the call, so there is no silent third-party participant
  • no vendor retains your conversations or can train models on them
  • you decide what is kept, for how long, and who sees it

This is why we built Mono to process everything locally. It records audio from any app (Zoom, Teams, Meet, Discord, WhatsApp) and transcribes it entirely on-device. Your meeting audio never leaves your computer.
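As a sketch of the architectural difference, the hypothetical pipeline below (stub names, not Mono's actual internals) shows the property that matters for compliance: the transcription path reads and writes local files only, and contains no network client at all.

```python
from pathlib import Path
import tempfile

class StubLocalModel:
    """Stand-in for an on-device speech model (e.g. a whisper.cpp-style
    binding loaded from local weights). Hypothetical interface."""
    def transcribe(self, audio: bytes) -> str:
        return f"<transcript of {len(audio)} audio bytes>"

def transcribe_locally(audio_path: Path, model) -> str:
    # Audio goes from disk to the model in memory and back to the
    # caller. Note what is absent: no HTTP client, no API key,
    # no bot joining the call on your behalf.
    return model.transcribe(audio_path.read_bytes())

# Demo with placeholder audio written to a local temp file.
with tempfile.NamedTemporaryFile(suffix=".wav", delete=False) as f:
    f.write(b"\x00" * 1024)
    path = Path(f.name)

print(transcribe_locally(path, StubLocalModel()))
```

Because the vendor's servers never see the audio, there is no third party collecting voiceprints, retaining transcripts, or training models on your conversations, which is precisely the conduct the lawsuits above target.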

It's not the only approach (you could also just... not transcribe meetings), but if you need transcription and care about compliance, local processing is the only architecture that makes sense in 2026.

What you should do

If you're currently using cloud-based AI meeting tools:

  1. Check your company's policy. IT may have already blocked these tools without announcing it.
  2. Review the consent flow. Does the tool get explicit consent from every participant, or just the host?
  3. Check the privacy policy. Is your data used for training? Is there a data retention policy?
  4. Consider your jurisdiction. Are you in a two-party consent state? Are any participants?
  5. Evaluate local alternatives. If compliance matters, local processing eliminates most of the risk.

The AI meeting tool industry moved fast and broke things — specifically, privacy laws. The lawsuits now working through the courts will establish precedents, but you don't have to wait for the verdicts to protect yourself.

Record meetings without the legal risk

Mono transcribes locally using on-device AI. No cloud upload. No bots. No third party holding your data. $50 once, no subscription.