Apple Dictation Privacy Risks: What Really Happens to Your Voice Data
In January 2026, Apple mailed settlement checks to millions of Siri users, part of a $95 million payout over claims that Siri secretly recorded private conversations. If you dictate on a Mac, you should know exactly where your voice data goes, who can access it, and what Apple’s privacy promises actually mean in practice.
This guide breaks down the technical reality of Apple’s dictation privacy, what recent research has uncovered, and what alternatives exist for professionals who can’t afford to take risks with sensitive data. For a complete walkthrough of setting up dictation on macOS, see our Mac dictation setup guide.
How Apple Dictation Actually Works
Apple’s built-in dictation processes your voice differently depending on which Mac you own.
Apple Silicon Macs (M1 and Later)
On Macs with Apple Silicon, keyboard dictation runs on-device by default. Your spoken audio is processed by a local speech recognition model running on the Mac’s Neural Engine. No audio data leaves your computer for basic dictation, and no internet connection is required.
This sounds reassuring — until you look at the details.
Intel Macs
If you own an Intel Mac, all dictation audio is sent to Apple’s servers for processing. There is no on-device option. Your voice data travels over the internet to Apple’s data centers, gets transcribed, and the text is sent back. Apple’s privacy page states this data is “not associated with your Apple Account” and is retained for up to six months to improve dictation — but it still leaves your machine.
To check which mode your Mac uses: go to System Settings > Keyboard > Dictation. If you see an option to disable sending audio to Apple, your Mac supports on-device processing. Intel Macs don’t show this option because cloud processing is the only available mode.
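You can also check from the terminal. The sketch below uses uname to report the CPU architecture: Apple Silicon Macs report arm64, Intel Macs report x86_64, which tells you whether on-device dictation is even possible on your hardware. (One caveat: a shell running under Rosetta translation would also report x86_64; a standard Terminal session on Apple Silicon reports arm64.)

```shell
#!/bin/sh
# Infer whether this Mac can support on-device dictation.
# Apple Silicon reports "arm64"; Intel Macs report "x86_64".
arch="$(uname -m)"
if [ "$arch" = "arm64" ]; then
    echo "Apple Silicon: keyboard dictation can run on-device"
else
    echo "$arch: Apple's built-in dictation sends audio to Apple's servers"
fi
```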
The Siri Complication
Even on Apple Silicon Macs, Siri dictation (as opposed to keyboard dictation) still involves server communication. And Siri is deeply integrated with dictation — when you use “Hey Siri” or press and hold the microphone button, different privacy rules apply than when you use keyboard dictation. Most users have no way to tell which processing path their dictation is taking at any given moment.
The $95 Million Siri Privacy Settlement
In January 2025, Apple agreed to settle a class action lawsuit for $95 million — one of the largest privacy settlements in consumer technology history.
What Was Alleged
The case began in 2019 when California resident Fumiko Lopez discovered her Siri-enabled devices were activating without the “Hey Siri” trigger phrase. According to court filings, Siri allegedly:
- Recorded private conversations during unintended activations
- Transmitted recordings to Apple contractors for quality review — without user consent
- Shared data with third parties including, allegedly, advertisers for ad targeting
The settlement covered current or former owners of any Siri-enabled device — iPhone, iPad, Apple Watch, MacBook, iMac, HomePod, iPod touch, or Apple TV — whose private communications were obtained by Apple through an unintended Siri activation.
What Apple Said
Apple denied all allegations throughout the proceedings, stating that Siri has been “engineered to protect user privacy from the beginning,” that Siri data was never used to build marketing profiles, and was never sold for any purpose.
The company agreed to pay $95 million without admitting wrongdoing. Settlement checks were distributed starting January 23, 2026, with actual payouts averaging about $8 per device — far below the $20 per device cap.
What It Means for Dictation Users
Whether or not you believe Apple’s denials, the settlement established a legal record: Apple’s voice assistant was activating and recording when users didn’t intend it to, and those recordings were shared with contractors. If you dictate sensitive information — medical notes, legal documents, financial data — that precedent matters.
Black Hat 2025: What Researchers Actually Found
Seven months after the settlement, the picture got worse.
At Black Hat USA 2025, Yoav Magid from Israeli cybersecurity firm Lumia Security presented research showing Apple Intelligence routinely transmits sensitive user data to Apple servers beyond what its privacy policies indicate.
Key Findings
Siri scans your device and reports what it finds. When given a voice prompt, Siri automatically scans users’ devices for installed applications related to the query and transmits this information to Apple servers. Ask about the weather, and Siri identifies and reports all weather-related apps on your device.
Location data accompanies every request. Regardless of whether location information is relevant to your query, location data is sent to Apple servers with every Siri request. Ask Siri to set a timer, and your location goes to Apple’s servers.
Dictated messages are sent to Apple servers — even on encrypted platforms. Messages dictated through Siri to platforms like WhatsApp are transmitted to Apple servers before being sent. This undermines the end-to-end encryption that WhatsApp promises. Testing confirmed that this data transmission continues even when users explicitly disable settings that allow Siri to “learn” from specific applications.
Audio playback metadata is collected. The names of songs, podcasts, or videos you’re playing are sent to Apple servers without clear user visibility into these data flows.
The Two-Policy Problem
Perhaps the most concerning finding: identical requests are routed through different privacy frameworks depending on how you phrase them.
Asking “What is the weather today?” sends data to Siri servers under one privacy policy. Asking “Ask ChatGPT what is the weather today?” routes the request through Apple Intelligence’s Private Cloud Compute under entirely different terms. As the researcher noted: “Two similar questions, two different traffic flows, two different privacy policies.”
Users have no way to predict which privacy framework applies to their interactions.
Apple’s Response
Apple acknowledged some aspects of the research after Lumia reported the issues in February 2025. Initially, Apple indicated it would work toward fixes. By July, Apple shifted its position, telling researchers that the message transmission behavior was “not a privacy issue related to Apple Intelligence.”
What Apple’s Privacy Policy Actually Says
“Improve Siri and Dictation” is opt-in. By default, Apple says it does not retain audio recordings. If you opt in to “Improve Siri and Dictation,” Apple stores audio samples and transcripts to improve speech recognition. You can opt out at any time in System Settings > Privacy & Security > Analytics & Improvements.
But metadata is always collected. Even without the opt-in, Siri processes generate metadata — the type of request, device information, and (as Black Hat research showed) location data and installed app information. Apple’s policy doesn’t clearly define the retention period for all metadata categories.
The gap between marketing and reality. Apple’s long-running “What happens on your iPhone stays on your iPhone” campaign created an expectation of complete local processing. The reality is more nuanced: some processing is local, some is cloud-based, and which path your data takes depends on factors most users can’t predict or control.
Who Should Be Most Concerned
For casual dictation — shopping lists, text messages to friends — Apple’s privacy practices are likely adequate for most people. But certain professionals face real risk:
Healthcare Professionals
Dictating patient information — SOAP notes, medical histories, diagnostic observations — through a system that may send data to cloud servers creates HIPAA compliance concerns. Even if Apple’s servers are secure, the act of transmitting protected health information (PHI) to a third party without a Business Associate Agreement (BAA) is a potential violation. Apple does not offer a BAA for Siri or Dictation services.
Legal Professionals
Attorney-client privilege requires that communications remain confidential. If dictated legal notes, client conversations, or case strategy are transmitted to Apple’s servers — even temporarily — the privilege may be waived. The $95M settlement demonstrated that voice data was, at times, reviewed by third-party contractors.
Financial Professionals
Client financial data, account numbers, investment strategies, and trading information are subject to regulations including SEC Rule 17a-4, SOX, and state privacy laws. Cloud-processed dictation introduces an unauthorized data processor into the chain of custody.
Anyone Handling Sensitive Information
Journalists protecting sources, executives discussing M&A activity, HR professionals handling employee complaints, therapists taking session notes — anyone whose dictated words could cause harm if disclosed should understand exactly where that data goes.
The Alternative: On-Device Dictation That Can’t Upload Your Data
The fundamental problem with Apple’s approach isn’t malice — it’s architecture. When cloud processing exists as a fallback, there’s always a path for your data to leave your device. Policy changes, software updates, or bugs can alter which data takes that path.
A different architectural approach eliminates the risk entirely: dictation software that processes everything on-device, with no cloud component at all.
VoicePrivate takes this approach. It runs a purpose-built speech recognition engine locally on your Mac — no internet connection required, no server communication, no data upload path. Your voice data physically cannot leave your machine because the software has no mechanism to send it anywhere.
This isn’t a policy promise that could change with the next terms-of-service update. It’s an architectural guarantee you can verify yourself by monitoring network traffic while dictating.
Key differences from Apple Dictation:
- Works identically on Intel and Apple Silicon Macs (always local)
- No Siri integration means no unintended activation recordings
- No metadata collection — no installed app scanning, no location transmission
- Custom vocabularies for medical, legal, and financial terminology
- 25+ languages supported (up to 99 with specialty editions)
- HIPAA-suitable architecture (no PHI ever transmitted)
For a detailed side-by-side breakdown, see our VoicePrivate vs macOS Dictation comparison.
How to Verify Any Dictation App’s Privacy Claims
Don’t take any company’s word for it — including ours. Here’s how to verify:
- Monitor network traffic while dictating using tools like Little Snitch, Wireshark, or macOS’s built-in nettop command
- Disconnect from the internet and test whether dictation still works (truly local apps function normally offline)
- Check DNS queries during dictation sessions for connections to cloud services
- Inspect the app’s code-signing entitlements with the codesign tool (entitlements live in the code signature, not Info.plist); a sandboxed app needs the com.apple.security.network.client entitlement to open outgoing connections
If an app claims local processing but phones home during dictation, you have your answer.
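The checks above can be scripted. This is a minimal sketch, not a complete audit: it uses pgrep and lsof, both shipped with macOS, to list any open network sockets for a running app, then dumps the app’s entitlements with codesign. “VoicePrivate” is used as a placeholder app name; substitute whatever app you want to audit, and note that an app’s process name may differ from its bundle name.

```shell
#!/bin/sh
# Audit a dictation app for network activity and network entitlements.
# Usage: ./audit.sh AppName   (defaults to the placeholder "VoicePrivate")
audit_app() {
    name="$1"
    app_path="/Applications/${name}.app"

    # 1. List open network sockets for the running process.
    #    No output from lsof means no open network connections.
    pid="$(pgrep -x "$name" 2>/dev/null | head -n 1)"
    if [ -n "$pid" ]; then
        echo "Open sockets for $name (pid $pid):"
        lsof -a -i -p "$pid" || echo "  (none: no network connections)"
    else
        echo "$name is not running; launch it, dictate, then re-run."
    fi

    # 2. Dump the app's entitlements. For a sandboxed app, the absence of
    #    com.apple.security.network.client means it cannot open outgoing
    #    connections at all.
    if [ -d "$app_path" ]; then
        codesign -d --entitlements - "$app_path" 2>/dev/null
    fi
}

audit_app "${1:-VoicePrivate}"
```

Run it while actively dictating: a truly local app shows no sockets during a session, and repeating the test with Wi-Fi off confirms transcription still works offline.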
Frequently Asked Questions
Does Apple record your dictation?
It depends on your device and settings. On Intel Macs, dictation audio is sent to Apple servers for processing. On Apple Silicon Macs, keyboard dictation is processed on-device, but Siri-related dictation may still involve server communication. If you’ve enabled “Improve Siri and Dictation,” Apple retains audio samples regardless of device.
Is Mac dictation private?
Partially. On Apple Silicon Macs, keyboard dictation is processed locally. However, Black Hat 2025 research showed that Siri still sends metadata (location, installed apps, audio playback info) to Apple servers with every request. On Intel Macs, all dictation audio is sent to Apple’s servers.
Does Siri listen when you’re not using it?
The $95M settlement specifically addressed allegations that Siri activated and recorded without the “Hey Siri” trigger phrase. Apple denied intentional eavesdropping but paid $95 million to settle the claims. The settlement covered recordings made during unintended activations.
What happened in the Apple Siri lawsuit?
In January 2025, Apple agreed to pay $95 million to settle a class action lawsuit alleging Siri recorded private conversations without consent and shared them with third-party contractors. Settlement checks were distributed starting January 2026, averaging about $8 per device.
How can I dictate privately on Mac?
For maximum privacy, use a dictation app with 100% on-device processing and no cloud component — such as VoicePrivate. Alternatively, on Apple Silicon Macs, disable “Improve Siri and Dictation” in System Settings and use only keyboard dictation (not Siri). On Intel Macs, there is no way to keep dictation fully private using Apple’s built-in tools.
Sources: Apple Legal — Siri, Dictation & Privacy; Lumia Security — “AppleStorm: Unmasking the Privacy Risks of Apple Intelligence” (Black Hat USA 2025); CyberScoop; NPR; The Washington Post; Apple Newsroom.