[Illustration: a doctor pointing while holding a clipboard, facing a large silhouette of a human head with a circuit-board brain, symbolizing the integration of artificial intelligence in healthcare.]

START


  1. Use HIPAA-Compliant AI Apps

    1. Only use AI tools that protect Protected Health Information (PHI) with encryption and a Business Associate Agreement (BAA).

      1. Example Tool: OneStep (FDA-listed gait analysis app).

        1. Download OneStep on a HIPAA-secure device.

        2. Obtain verbal or written patient consent before recording.

        3. Record gait or mobility during a home visit.

        4. Review encrypted progress reports within the app — do not export data to unsecured platforms. 

      2. Why it matters: This lets you use cutting-edge AI while staying compliant. Patients benefit from precise tracking, and you stay safe under HIPAA.


  2. Encrypt All Patient Data

    1. Any patient-related audio, notes, or video must be stored in encrypted form (scrambled so only authorized people can access it); a short illustrative sketch of what that scrambling looks like under the hood follows the steps below.

      1. Example Tool: DeepScribe (AI medical scribe).

        1. Enable AI transcription during your patient visit.

        2. Speak naturally while the AI drafts your note.

        3. Edit the draft in-app and finalize the note.

        4. Save only inside the HIPAA-compliant system — never email or text it. 

      2. Why it matters: Encryption ensures that even if your device is lost, PHI remains secure. This is non-negotiable under HIPAA.
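      3. For the curious: below is a minimal sketch of what "encryption at rest" looks like under the hood, written in Python with the open-source cryptography library. The note text, variable names, and key handling are illustrative assumptions, not how any particular app works; HIPAA-compliant tools handle this for you, and clinicians never need to write code.

        from cryptography.fernet import Fernet

        # Generate a secret key. Only a person or system holding this key
        # can ever read the data back.
        key = Fernet.generate_key()
        cipher = Fernet(key)

        # A hypothetical visit note (illustrative only, never real PHI).
        visit_note = b"Gait training, 30 min; ambulated 150 ft with rolling walker."

        # "Scrambling" the note: the output is unreadable without the key.
        encrypted_note = cipher.encrypt(visit_note)
        print(encrypted_note)  # random-looking bytes, safe to store at rest

        # Only a key holder can reverse the scrambling.
        print(cipher.decrypt(encrypted_note).decode())

      The takeaway is conceptual: data stored this way is useless to anyone who takes the device but does not hold the key, which is exactly the protection HIPAA expects for PHI at rest.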


  3. Get a BAA Before Using Any AI Tool

    1. A BAA is a legal contract in which the AI vendor commits to following HIPAA rules when handling your patient data.

      1. Before using an AI app, email or call the vendor and ask: “Do you provide a signed BAA for covered entities?”

      2. If they don’t, do not use it.

      3. Why it matters: Without a BAA, even if the tool looks secure, you’re still responsible for HIPAA violations.


STOP


  1. Using Free AI Apps for Patient Information

    1. Why it’s a problem: Free AI tools like consumer dictation apps, ChatGPT’s free version, or note-taking apps don’t follow HIPAA safeguards. They may store, reuse, or even train their AI on the data you input. If you type or dictate PHI, it can be exposed or sold.

    2. Clinician example: A PT dictates a patient’s name, diagnosis, and gait details into a free dictation app. That information may be stored on non-HIPAA servers with no encryption — a direct HIPAA violation.

    3. What to do instead: Use HIPAA-compliant AI tools like DeepScribe or Augmedix that sign BAAs and encrypt every interaction.


  2. Saving PHI on Personal Phones or Laptops Without Encryption

    1. Why it’s a problem: Even if you use a secure app, downloading reports or screenshots to your personal phone or laptop creates a weak spot. If your device is lost, stolen, or hacked, the patient data is exposed. HIPAA fines don’t care if it was “an accident.”

    2. Clinician example: An OT saves a progress report from an app onto their personal laptop for “quick access.” That laptop has no encryption or password lock. If it’s stolen, every patient listed in that file is compromised.

    3. What to do instead: Only access PHI through HIPAA-compliant apps and platforms that encrypt data. If you must use your device, enable encryption and strong passwords, and never download PHI outside the secure app.


  3. Assuming “Secure-Looking” Apps Are Compliant

    1. Why it’s a problem: Just because an app has a lock icon, says “secure,” or requires a password doesn’t mean it’s HIPAA compliant. True HIPAA compliance requires:

      1. Technical safeguards (encryption, access controls, audit logs)

      2. Administrative safeguards (training, risk assessments)

      3. Legal safeguards (signed Business Associate Agreement)

    2. Clinician example: A home health nurse uses a wellness app that looks professional and says “secure data.” But since it doesn’t provide a BAA, it is not HIPAA-compliant. If PHI is entered, the clinician is liable for a violation.

    3. What to do instead: Before using any AI app, ask the vendor for a signed BAA and confirm HIPAA compliance in writing. No BAA = not compliant, no matter how “official” the app looks.


WHY


  1. Legal & Financial Risk

    1. Using AI without HIPAA compliance carries steep consequences. In 2025, penalties can start at about $141 per violation and climb to over $70,000 per violation, with annual maximums reaching $2.1 million for severe or uncorrected breaches. Serious cases may also bring criminal fines and prison time. In California, laws like the CMIA add extra penalties, meaning clinicians could face both federal and state fines. Beyond money, violations may trigger audits, loss of Medicare/Medi-Cal contracts, and reputational harm that can be more damaging than the financial hit.


  2. Patient Trust

    1. Beyond the fines and penalties, HIPAA violations compromise something harder to repair (but equally vital): the trust between you (the clinician) and your patient.

      1. When a patient invites you into their home and shares sensitive medical, mobility, cognitive, or functional data, they expect confidentiality. If an AI tool leaks any of it (photos, video, movement data, speech, personal history), that trust is broken.

      2. Once trust is broken, it’s hard to rebuild. A patient may stop sharing information, may reduce their therapy participation, or even switch providers. This can hurt outcomes.

      3. Also, word spreads. A patient who feels their privacy was compromised may complain publicly, leave bad reviews, or file ethics complaints. This can damage your professional reputation, hurt referrals, and even draw scrutiny from licensing boards or oversight agencies.

