
30 Seconds Of Therapy: HIPAA Compliance Considerations When Using AI
Sep 22
3 min read

START
Use HIPAA-Compliant AI Apps
Only use AI tools that protect Protected Health Information (PHI) with encryption and a signed Business Associate Agreement (BAA).
Example Tool: OneStep (FDA-listed gait analysis app).
Download OneStep on a HIPAA-secure device.
Obtain verbal or written patient consent before recording.
Record gait or mobility during a home visit.
Review encrypted progress reports within the app — do not export data to unsecured platforms.
Why it matters: This lets you use cutting-edge AI while staying compliant. Patients benefit from precise tracking, and you stay safe under HIPAA.
Encrypt All Patient Data
Any patient-related audio, notes, or video must be stored in encrypted form (scrambled so only authorized people can access it).
Example Tool: DeepScribe (AI medical scribe).
Enable AI transcription during your patient visit.
Speak naturally while the AI drafts your note.
Edit the draft in-app and finalize the note.
Save only inside the HIPAA-compliant system — never email or text it.
Why it matters: Encryption ensures that even if your device is lost or stolen, PHI remains secure. This is non-negotiable under HIPAA.
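For the technically curious, here is a minimal sketch of what "encrypted at rest" means: the note is scrambled with a secret key, and only someone holding that key can read it. The example uses Python's widely available cryptography library purely as an illustration; it is not how DeepScribe or any other vendor actually implements encryption, and the patient note shown is made up.

```python
# Illustration only: what "encrypted at rest" means in principle.
# Not any vendor's actual implementation; the note below is fictional.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # secret key held only by the authorized system
cipher = Fernet(key)

note = b"Visit note: patient ambulated 50 ft with rolling walker, min assist."
encrypted = cipher.encrypt(note)      # this scrambled blob is what gets stored
print(encrypted[:40])                 # unreadable without the key

print(cipher.decrypt(encrypted).decode())  # readable again only with the key
```

If the device is lost, whoever finds it sees only the scrambled blob, not the note, which is exactly the protection HIPAA's encryption safeguard is meant to provide.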
Get a BAA Before Using Any AI Tool
A BAA is a legal contract in which the AI vendor commits to following HIPAA rules when handling your patient data.
Before using an AI app, email or call the vendor and ask: “Do you provide a signed BAA for covered entities?”
If they don’t, do not use it.
Why it matters: Without a BAA, even if the tool looks secure, you’re still responsible for HIPAA violations.
STOP
Free AI Apps for Patient Information
Why it's a problem: Free AI tools like consumer dictation apps, the free version of ChatGPT, or generic note-taking apps don't follow HIPAA safeguards. They may store, reuse, or even train their AI on the data you input. If you type or dictate PHI, it can be exposed or sold.
Clinician example: A PT dictates a patient's name, diagnosis, and gait details into a free dictation app. That information may be stored on servers with no HIPAA safeguards and no encryption — a direct HIPAA violation.
What to do instead: Use HIPAA-compliant AI tools like DeepScribe or Augmedix that sign BAAs and encrypt every interaction.
Saving PHI on Personal Phones or Laptops Without Encryption
Why it’s a problem: Even if you use a secure app, downloading reports or screenshots to your personal phone or laptop creates a weak spot. If your device is lost, stolen, or hacked, the patient data is exposed. HIPAA fines don’t care if it was “an accident.”
Clinician example: An OT saves a progress report from an app onto their personal laptop for “quick access.” That laptop has no encryption or password lock. If it’s stolen, every patient listed in that file is compromised.
What to do instead: Only access PHI through HIPAA-compliant apps and platforms that encrypt data. If you must use a personal device, enable full-device encryption and a strong password, and never download PHI outside the secure app.
Assuming “Secure-Looking” Apps Are Compliant
Why it’s a problem: Just because an app has a lock icon, says “secure,” or requires a password doesn’t mean it’s HIPAA compliant. True HIPAA compliance requires:
Technical safeguards (encryption, access controls, audit logs)
Administrative safeguards (training, risk assessments)
Legal safeguards (signed Business Associate Agreement)
Clinician example: A home health nurse uses a wellness app that looks professional and says “secure data.” But since it doesn’t provide a BAA, it is not HIPAA-compliant. If PHI is entered, the clinician is liable for a violation.
What to do instead: Before using any AI app, ask the vendor for a signed BAA and confirm HIPAA compliance in writing. No BAA = not compliant, no matter how “official” the app looks.
WHY
Legal & Financial Risk
Using AI without HIPAA compliance carries steep consequences. In 2025, penalties can start at about $141 per violation and climb to over $70,000 per violation, with annual maximums reaching $2.1 million for severe or uncorrected breaches. Serious cases may also bring criminal fines and prison time. In California, laws like the Confidentiality of Medical Information Act (CMIA) add extra penalties, meaning clinicians could face both federal and state fines. Beyond money, violations may trigger audits, loss of Medicare/Medi-Cal contracts, and reputational harm that can be more damaging than the financial hit.
Patient Trust
Beyond the dollars and penalties, HIPAA violations compromise something harder to restore (but equally vital): the trust between you (the clinician) and your patient.
When a patient invites you into their home and shares sensitive medical, mobility, cognitive, or functional data, they expect confidentiality. If an AI tool leaks something — photos, video, movement data, speech, personal history — that trust is broken.
Once trust is broken, it's hard to rebuild. A patient may stop sharing information, reduce their participation in therapy, or switch providers altogether. Any of these can hurt outcomes.
Word also spreads. A patient who feels their privacy was compromised may complain publicly, leave bad reviews, or file ethics complaints. This can damage your professional reputation, affect referrals, and even draw licensing or agency scrutiny.





