Privacy Policy
"Patient privacy isn't a feature. It's the architecture."
Our Mission
VoicePrivate — Healthcare Edition exists because patient data is among the most intimate information entrusted to any professional. When you dictate session notes, treatment plans, or clinical documentation, every word contains protected health information. No technology vendor should require you to expose that data as a cost of doing business.
We built VoicePrivate — Healthcare Edition to prove that world-class clinical transcription doesn't require cloud processing. Every word you speak is processed by AI models running directly on your hardware, with a 74,000+ term medical dictionary that ensures accuracy. Your audio never touches our servers because we don't have servers for audio processing.
VoicePrivate — Healthcare Edition was founded by engineers frustrated with the false choice between transcription quality and patient privacy. Patient privacy should be the default, not an add-on.
Privacy Architecture
Here's exactly what happens when you use VoicePrivate — Healthcare Edition — and what doesn't happen.
Privacy commitment
VoicePrivate — Healthcare Edition is designed so that voice-to-text technology can be used without compromising patient privacy.
PHI Protected by Architecture
Audio data: Never transmitted. All speech processing occurs exclusively on the provider's device using local AI models.
Transcription text: Never transmitted. All transcribed text is stored only in a local database on the provider's device.
Patient communications: Never exposed to third parties. No cloud servers, no API calls, no third-party AI services receive any protected health information.
Clinical documentation: All work product created through VoicePrivate — Healthcare Edition remains exclusively under the provider's control.
What we don't do
- ✗ We never transmit patient audio to any server
- ✗ We never use PHI to train AI models
- ✗ We never share provider data with third parties
- ✗ We never track dictation behavior or content
- ✗ We never embed third-party SDKs or tracking pixels
Network requests VoicePrivate — Healthcare Edition makes
License validation: A simple encrypted check to verify your subscription status. Contains only your license key — no audio, no text, no patient data, no metadata.
Model downloads: One-time download of AI model files from our CDN. After download, everything runs offline.
iCloud sync (Team plan): End-to-end encrypted sync of transcription history. Encrypted on-device before upload — Apple cannot read your data.
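To illustrate the shape of the license check described above, here is a minimal Swift sketch. The endpoint URL and JSON field name are hypothetical placeholders, not the app's actual API; the point is that the request body carries only the license key.

```swift
import Foundation

// Hypothetical sketch of a minimal license check. The payload
// contains the license key and nothing else — no audio, no text,
// no patient data, no metadata.
func validateLicense(key: String, completion: @escaping (Bool) -> Void) {
    // Placeholder endpoint; the real validation URL is not public.
    guard let url = URL(string: "https://example.com/v1/license/validate") else {
        completion(false)
        return
    }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(
        withJSONObject: ["licenseKey": key])

    // Sent over HTTPS; the response only confirms subscription status.
    URLSession.shared.dataTask(with: request) { _, response, _ in
        let ok = (response as? HTTPURLResponse)?.statusCode == 200
        completion(ok)
    }.resume()
}
```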
Privacy by architecture
Encrypted Backups
All iCloud backups and exports are encrypted using industry-standard AES-256-GCM encryption.
Local Key Management
Encryption keys are generated and stored locally on your device using platform-provided security APIs. Keys never leave your device.
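A minimal sketch of this pattern using Apple's CryptoKit (the data and variable names below are illustrative; the app's actual key-handling code is not public):

```swift
import Foundation
import CryptoKit

do {
    // Generate a 256-bit symmetric key locally; it never leaves the device.
    let key = SymmetricKey(size: .bits256)

    // Encrypt a payload with AES-256-GCM before it is written or synced.
    // The sealed box bundles nonce, ciphertext, and authentication tag.
    let plaintext = Data("example transcription text".utf8)
    let sealed = try AES.GCM.seal(plaintext, using: key)
    let blob = sealed.combined!   // the only thing that would ever be stored

    // Only a holder of the locally stored key can decrypt.
    let box = try AES.GCM.SealedBox(combined: blob)
    let decrypted = try AES.GCM.open(box, using: key)
    assert(decrypted == plaintext)
} catch {
    print("Encryption error: \(error)")
}
```

In a real app the key would be persisted in the Keychain via platform security APIs rather than held in memory, which keeps it device-bound.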
Open-Source AI Models
The AI models that process your voice are open source (MIT license). The underlying speech recognition models are fully transparent and publicly auditable.
Minimal Permissions
VoicePrivate — Healthcare Edition requests only the permissions it needs to function. Microphone access requires explicit user permission granted through macOS system prompts.
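On macOS, that explicit permission flow goes through AVFoundation; a short sketch of the standard pattern (not the app's actual source):

```swift
import AVFoundation

// Check the current microphone authorization and prompt if needed.
// macOS shows the system permission dialog on the first request.
switch AVCaptureDevice.authorizationStatus(for: .audio) {
case .authorized:
    print("Microphone access already granted")
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .audio) { granted in
        print(granted ? "Access granted" : "Access denied")
    }
default:
    print("Access denied or restricted — enable it in System Settings")
}
```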
Get in touch
Questions about our privacy practices or need vendor security documentation? We're happy to help.
Email us anytime at
Contact Us