A new lawsuit filed in Illinois federal court highlights the legal risks associated with AI-based meeting assistants that offer transcription and speaker identification services on platforms like Zoom and Microsoft Teams. The complaint, filed by Illinois resident Katelin Cruz, alleges that California technology company Fireflies.AI Corp. illegally collects and stores biometric voice data of individuals without their knowledge or consent.
According to Cruz’s complaint, Fireflies.AI’s meeting assistant allegedly recorded, analyzed, transcribed, and stored the unique voice characteristics (i.e., “voiceprints”) of each meeting participant. This includes people who have never created a Fireflies account, never agreed to its terms of service, and never given written consent for their biometric data to be collected.
Cruz argues that voiceprints are sensitive biometric identifiers, often used to authenticate access to personal or financial information. If compromised, they pose a significant risk of identity theft and fraud.
Fireflies.AI promotes its software as having “speaker recognition” capabilities; in other words, the technology can distinguish between different people speaking in a meeting by analyzing the unique voice of each participant. The AI assistant often joins meetings automatically when activated by the host, running in the background and assigning statements to each speaker.
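For readers curious about what "speaker recognition" involves under the hood, the sketch below illustrates one common approach in minimal form: comparing voice embeddings ("voiceprints") of individual utterances and grouping similar ones under the same speaker label. This is a generic illustration, not Fireflies.AI's actual implementation; the `assign_speakers` function, the similarity threshold, and the randomly generated embeddings are all assumptions standing in for the output of a real speaker-embedding model.

```python
# Illustrative sketch only: how a meeting assistant might attribute utterances
# to speakers using voice embeddings. Embeddings here are random stand-ins for
# the output of a real speaker-embedding model; this is not any vendor's code.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def assign_speakers(embeddings: list[np.ndarray], threshold: float = 0.75) -> list[int]:
    """Greedy online clustering: compare each utterance embedding against the
    running centroid of every known speaker; if none is similar enough,
    create a new speaker label."""
    centroids: list[np.ndarray] = []
    counts: list[int] = []
    labels: list[int] = []
    for emb in embeddings:
        best, best_sim = -1, -1.0
        for i, c in enumerate(centroids):
            sim = cosine_similarity(emb, c)
            if sim > best_sim:
                best, best_sim = i, sim
        if best_sim >= threshold:
            # Same voice as an existing speaker: update that speaker's centroid.
            counts[best] += 1
            centroids[best] += (emb - centroids[best]) / counts[best]
            labels.append(best)
        else:
            # Unfamiliar voice: register a new speaker profile.
            centroids.append(emb.copy())
            counts.append(1)
            labels.append(len(centroids) - 1)
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two simulated voiceprints, each producing several noisy utterance embeddings.
    voice_a, voice_b = rng.normal(size=128), rng.normal(size=128)
    utterances = [voice_a + rng.normal(scale=0.1, size=128) for _ in range(3)] + \
                 [voice_b + rng.normal(scale=0.1, size=128) for _ in range(2)]
    print(assign_speakers(utterances))  # e.g. [0, 0, 0, 1, 1]
```

The legal point raised in the complaint is that the embedding itself functions as a biometric identifier: it is derived from, and distinctive of, a specific person's voice, whether or not that person ever signed up for the service.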
According to the complaint:
- Fireflies’ privacy policy allows it to collect and process “meeting data” and “derivatives” and store meeting recordings in its systems.
- The company says it will only delete personal information when it believes it is “no longer necessary” or if a user makes a specific request, but it does not give a clear retention period.
Cruz says that during a virtual meeting in November, her voice was recorded and analyzed, even though she never consented, created an account, or interacted with Fireflies.AI in any way. Yet the software still built a profile based on her voice and attributed statements to her in the meeting transcript.
She also points out that Fireflies’ terms of service apply only to registered users or those who click “I agree.” From unregistered meeting participants, no written consent is ever obtained, nor are they informed that their voiceprints might be collected.
Additionally, Cruz alleges that Fireflies.AI does not publish a policy stating how long it retains biometric data or when it destroys that data, as the Illinois Biometric Information Privacy Act (BIPA) expressly requires.
Cruz is seeking class action status, potentially representing anyone whose voice biometric data was collected by Fireflies.AI without consent. The complaint asks the court to award statutory damages, injunctive relief, and costs, highlighting the broader impact on consumers’ privacy rights in the era of AI-based services.
Illinois’ BIPA is among the strictest biometric privacy laws in the United States, requiring written notice and explicit consent before a business can collect, store, or use biometric data such as fingerprints, retinal scans, or voiceprints. It also requires businesses to maintain a publicly available policy governing the retention and destruction of that data.
This case could have far-reaching consequences not only for Fireflies.AI, but also for any AI meeting tool provider that records and processes personal voice data, especially if it does so without clear disclosure or user consent.
As remote work and virtual meetings remain the norm, companies offering smart meeting assistants will face increasing privacy scrutiny. The outcome of this lawsuit could set an important precedent for protecting biometric privacy in digital workplaces. For those concerned about their own privacy during virtual meetings, it is wise to ask the meeting organizer which third-party tools are in use and how participant data is handled and protected.