Last week, the Office for Civil Rights (OCR) announced its second enforcement action and settlement with a provider for failing to comply with HIPAA’s patient access requirements. Korunda Medical, LLC, a primary care and pain management practice in Florida, agreed to pay $85,000 and comply with a Corrective Action Plan (CAP) after a patient complained that it refused to provide records in the requested electronic format and charged more than the reasonable, cost-based fee prescribed under HIPAA.

Korunda also apparently made the fatal mistake of ignoring OCR’s technical assistance. As I noted in connection with the $3 million resolution amount paid by a New York hospital system, when OCR offers technical assistance, the covered entity (or business associate) should follow it.

Payment of $85,000 may pale in comparison with payment of $3 million, but given the relative ease of complying with HIPAA’s patient access requirements, and added to the time and expense of responding to OCR’s investigation and negotiating the settlement agreement, it’s not an insignificant amount. In addition, compliance with the CAP will require additional expenditures of time and resources by Korunda. The CAP requires Korunda to submit the following to the U.S. Department of Health and Human Services (HHS):

* revised policies and procedures related to patient access that identify how Korunda calculates a reasonable, cost-based fee;

* training materials related to individual access rights (after which Korunda must train all workforce members);

* lists of access requests (including the date each request was received, the date it was completed, the format requested, the format provided, the number of pages if paper, the cost charged including postage, and all documentation related to denials of requests);

* notification of any failure by a member of its workforce to comply with its access policies and procedures; and

* annual reports regarding the implementation of the CAP requirements.
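The tracking items in the CAP’s access-request log map naturally onto a simple record structure. Here is a minimal Python sketch; the field names are illustrative, and the 30-day clock reflects the general response deadline in 45 C.F.R. § 164.524 (which also permits one 30-day extension not modeled here):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record mirroring the data points the CAP requires Korunda to log.
@dataclass
class AccessRequest:
    received: date
    completed: Optional[date]       # None while the request is still pending
    format_requested: str           # e.g. "electronic" or "paper"
    format_provided: Optional[str]
    pages: Optional[int]            # only meaningful for paper copies
    fee_charged: float              # must stay reasonable and cost-based
    denied: bool = False

def overdue(req: AccessRequest, today: date) -> bool:
    """Flag requests past the 30-day response window of 45 C.F.R. 164.524."""
    end = req.completed or today
    return (end - req.received).days > 30

req = AccessRequest(received=date(2019, 11, 1), completed=None,
                    format_requested="electronic", format_provided=None,
                    pages=None, fee_charged=0.0)
print(overdue(req, today=date(2019, 12, 15)))  # pending 44 days -> True
```

A log like this also makes the CAP’s periodic reporting to HHS straightforward, since every required data point is already captured per request.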

OCR has been focused on HIPAA’s access rights for the last few years. See here and here for posts from 2016 on this topic, and here for OCR’s first Resolution Agreement involving an access rights violation (which also triggered an $85,000 settlement amount and a similar CAP). Responding in a timely manner to patient access requests, providing the information in the format requested, not overcharging, and jumping on any technical assistance OCR sends your way are easy ways to avoid becoming the third example of, as OCR Director Severino put it, “bureaucratic inertia.”

More and more often, health care data is stolen or made inaccessible through targeted ransomware attacks. The Office for Civil Rights (OCR) published a newsletter this week that warns HIPAA covered entities and business associates about these attacks and offers practical tips for preventing and surviving them.

OCR’s warnings should resonate with covered entities and business associates alike:

  1. You are a ransomware target. 

    “Cybercriminals … found that customizing their attacks to specific, “quality” targets led to an increase in the amount of ransom payments.  Organizations commonly targeted by this type of attack have sensitive data, high data availability requirements, low tolerance for system downtime, and the resources to pay a ransom.  Many healthcare organizations fit this profile, and have become targets.”

  2. Cybercriminals may already be lurking in your information system, waiting to attack. 

    “Prior to initiating an attack, a malicious actor usually gains unauthorized access to a victim’s information system for the purpose of performing reconnaissance to identify critical services, find sensitive data, and locate backup. After this is done, the ransomware is deployed in a manner that produces maximum effect, infecting as many devices and as much data as possible and encrypting backup files so that recovery is difficult, if not impossible.”

  3. Cybercriminals often gain access by tricking your employees and authorized system users. 

    “Information system users remain one of the weakest links in an organization’s security posture.  Social engineering, including phishing attacks, is one of the most successful techniques used by threat actors to compromise system security.”

The newsletter then offers specific and practical tips as to how taking HIPAA Security Rule compliance seriously can help you avoid and/or quickly recover from targeted ransomware attacks. Here’s a summary of five key tips that should be at the top of your organization’s ransomware-prevention list:

  1. Train employees to avoid and report phishing scams. 

    “A training program should make users aware of the potential threats they face and inform them on how to properly respond to them.  This is especially true for phishing emails that solicit login credentials.  Additionally, user training on how to report potential security incidents can greatly assist in an organization’s response process by expediting escalation and notification to proper individuals.”

  2. Review and test security incident response procedures. 

    “Quick isolation and removal of infected devices from the network and deployment of anti-malware tools can help to stop the spread of ransomware and to reduce the harmful effects of such ransomware.  Response procedures should be written with sufficient details and be disseminated to proper workforce members so that they can be implemented and executed effectively.  Further, organizations may consider testing their security incident procedures from time to time to ensure they remain effective.”

  3. Maintain recoverable, secure, and up-to-date backups of all electronic protected health information. 

    “Organizations should keep in mind that threat actors have recently been actively targeting backup systems and backup data to prevent recovery.”

  4. Regularly check and strengthen access controls. 

    “[This measure will] stop or impede an attacker’s movements and access to sensitive data; e.g., by segmenting networks to limit unauthorized access and communications.  Further, because attacks frequently seek elevated privileges (e.g., administrator access), entities may consider solutions that limit the scope of administrator access, as well as solutions requiring stronger authentication mechanisms when granting elevated privileges or access to administrator accounts.”

  5. Regularly install software updates and patches.

With the explosion of health data flowing through cutting-edge companies, industry stakeholders are left to wonder how wearable devices, wellness programs, health applications, and the like should be regulated.

Contrary to popular belief, the Health Insurance Portability and Accountability Act (“HIPAA”) does not regulate all health information. HIPAA regulates health information collected and retained by covered entities and imposes downstream obligations on entities called business associates. HIPAA began with a limited purpose and was not created to cover all health information held by all entities. Enacted in 1996, HIPAA was originally designed to address the exchange of electronic health information and portability, so that an employee could maintain health insurance between employers.

Today’s perceived gaps in HIPAA, therefore, seem plausible, given its history and the realization that when HIPAA was created 23 years ago, the health landscape lacked today’s innovative health companies collecting and aggregating health data in new ways for new purposes, along with the accompanying geometric increase in the complexity and types of risk. While newer health tech companies may find themselves outside the HIPAA regime, a recent Senate bill hopes to expand HIPAA to include health information collected by fitness trackers, health-focused social media sites, and direct-to-consumer genetic testing companies. Though the Senate bill has remained stagnant, companies have seen enforcement beyond the HIPAA regime.

In March 2017, the New York Attorney General announced a settlement with the developers of three health apps, alleging that the creators used misleading claims and irresponsible privacy practices, with unclear and inconsistent statements about how they collected and shared users’ personal information with third parties. The Attorney General alleged violations of New York’s Consumer Protection Act and false advertising laws.

So what is the moral of the story? Just because your health company does not fit squarely within the HIPAA regime does not mean it escapes regulation. Keep in mind applicable state laws, like a state’s Consumer Fraud Act. Consider obligations to federal regulators, such as the FTC with respect to deceptive consumer practices and the FDA’s oversight of medical devices.

Have a good understanding of what your company is (and what it isn’t). If you’re a covered entity or business associate, your obligation to comply with HIPAA is clear. However, consider wearable devices, like Fitbit and smartwatches, that track users’ heart rate and sync their health data to smartphone apps. Consider wearable biosensors that monitor patients’ vital signs, temperature, and body posture. A deeper analysis of when health information shifts from HIPAA-protected to non-HIPAA-protected can be found in a separate Alert by Elizabeth Litten.

A large New York hospital system learned this lesson the expensive way.  According to a U.S. Department of Health and Human Services (HHS) press release issued earlier this week, the Office for Civil Rights (OCR) investigated a hospital system breach back in 2010 involving the loss of an unencrypted flash drive. According to the press release, OCR provided technical assistance to the hospital system as a result of that breach.

The hospital system apparently didn’t follow or benefit from OCR’s technical assistance, as it reported a breach in 2013 involving the loss of an unencrypted flash drive. According to OCR,

“Despite the previous OCR investigation, and [the hospital system’s] own identification of a lack of encryption as a high risk to ePHI, [the hospital system] permitted the continued use of unencrypted mobile devices.”

The hospital system then reported a third incident involving the theft of an unencrypted mobile device (an unencrypted personal laptop used by a resident surgeon) in 2017.  Although the laptop contained the PHI of only 43 patients, it wasn’t the size of the breach that likely triggered the $3 million payment amount.  The high payment amount seems directed at the hospital system’s apparent continuing failure to implement fairly straightforward security measures.

This hospital system had three strikes involving unencrypted devices before being hit with the $3 million resolution amount, and three important lessons can be learned from this resolution agreement. First, correct identified vulnerabilities. Second, when OCR offers technical assistance, follow it. And third, make sure you have a mobile device policy that requires encryption or addresses why encryption is not feasible.

OCR likely also considered the large size of the hospital system, and the relatively simple security policies and procedures the hospital system could have implemented to prevent the third breach, when it imposed the $3 million penalty and two-year corrective action plan.  However, even small covered entities and business associates should pay attention to this resolution agreement and take steps to minimize the risk of mobile device breaches.

“New York Gov. Andrew Cuomo recently signed legislation that will effectively prohibit ambulance and first response service providers from disclosing or selling patient data to third parties for marketing purposes.

The bill was signed into law on October 7. The new law bans the sale of patient data, or individually identifying information to third parties, outside of sales to health providers, the patient’s insurer, and other parties with appropriate legal authority.

Under the law, all information that can be used to identify a patient is protected from sales for marketing purposes, such as advertising, detailing, marketing, promotion, or any activity used to influence sales.”

Details from HealthIT Security.

This post also appears on Fox Rothschild’s Privacy Compliance & Data Security blog.

Artificial Intelligence (“AI”) refers to algorithmic tools that simulate human intelligence, mimic human actions, and can incorporate self-learning software. AI technology can reduce spending, provide alternative treatment ideas, and improve patient experience, diagnosis, and outcomes.

Consider virtual health assistants that deliver medication alerts and patient education, AI used to detect abnormalities in x-rays and MRIs, and AI that gives simultaneous feedback to patients and their physicians based on data captured by patient smartphones and wearable devices.  But with the advent of unchecked AI come concerns related to health information privacy and unconscious bias.

Privacy Concerns in AI Health Tech

AI advances could negatively impact health data privacy. The level of impact depends, to some extent, on how we define health data. “Health data” generally refers to information concerning the physical or mental health status of an individual and the delivery of or payment for health services to the individual. It incorporates data exchanged through traditional health technologies used by health care providers and health plans, as well as data exchanged through newer technologies, like wearable devices and virtual assistants.

Regulations do not yet fully address privacy concerns in AI health tech. HIPAA, for example, originally enacted in 1996, requires covered entities and business associates to implement standards to ensure the privacy and security of protected health information. Those standards, however, may not apply to tech companies that use ever-evolving third-party apps or algorithms to access the data. Experts express concern about companies collaborating to re-identify data formerly considered de-identified.[1] Consider data brokers who mine personal health data with AI tech and then sell the acquired health data to third parties like marketers, employers, and insurers. While a tech company’s relationship with a health insurance company falls under HIPAA’s scope, it is less clear what privacy laws apply to tech companies that limit their clientele to entities not otherwise directly or indirectly subject to HIPAA, such as life or disability insurers (which may also be the case for marketers and employers).

Bias Concerns in AI Health Tech

AI health tech must be developed and trained responsibly. AI learns by identifying patterns in data collected over many years. If the data collected reflect historical biases against vulnerable populations, the resulting models will only exacerbate those biases. Biases creep into data sets in various ways. Consider whether the input data include incomplete data sets for “at-risk” populations (groups with a historically higher risk for certain health conditions or illnesses). Consider the potential for under-diagnosis of health conditions within certain populations. Correcting for these biases in the development of the data set and in training processes will help avoid biased results in AI health tech.
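One basic check along these lines is to measure how well each population is represented in a data set before training begins. A minimal Python sketch with synthetic data follows; the group labels and the 20 percent threshold are illustrative assumptions, not a clinical standard:

```python
from collections import Counter

# Synthetic example: records labeled with a (hypothetical) demographic group.
# A skewed training set like this one is exactly where biased models start.
records = (["group_a"] * 900) + (["group_b"] * 100)

def representation(labels):
    """Share of the data set contributed by each group."""
    counts = Counter(labels)
    total = len(labels)
    return {group: count / total for group, count in counts.items()}

shares = representation(records)
print(shares)  # {'group_a': 0.9, 'group_b': 0.1}

# Flag groups falling below an (arbitrary) representation threshold.
flagged = [g for g, s in shares.items() if s < 0.2]
print(flagged)  # ['group_b'] -- underrepresented; consider re-sampling
```

A flagged group is a prompt for human review, such as collecting more data or re-weighting, rather than an automatic fix; representation alone does not capture under-diagnosis within a group.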

Companies should consider safeguards when employing AI health tech. Employee training is paramount. While AI promises a host of benefits, those involved in its creation and use must be aware of the potential for bias. Companies should also consider data integrity and built-in biases: consider the information AI tech relies on, and consider rechecking the data reviewed by AI.  Lastly, diversifying those engaged in creating, testing, and using the health care tech can also decrease bias.


Last May, around the time many schools let out for the summer, the Office for Civil Rights (“OCR”) published guidance entitled “Direct Liability of Business Associates” (the “Guidance”), which focuses, not surprisingly, on OCR’s ability to take enforcement action directly against HIPAA business associates. I meant to write about this guidance before Memorial Day, but since the back-to-school season is a good time to get things (including business associate agreements or “BAAs”) in order, this timing feels right.

The Guidance caught my attention not because it lists ten HIPAA failures or violations for which business associates are directly liable, but because it calls out one specific HIPAA violation that falls on the shoulders of the contracting covered entity:

… OCR lacks the authority to enforce the “reasonable, cost-based fee” limitation in 45 C.F.R. § 164.524(c)(4) against business associates … .

In other words, the OCR explains that, if a covered entity engages a business associate to fulfill an individual’s request for access to protected health information, it is the covered entity’s responsibility to ensure that the business associate complies with HIPAA’s “reasonable, cost-based fee” limitation (and any more stringent state law requirement).

We’ve posted on the topic of individual access rights under HIPAA (see here and here), and have also posted on the topic of what amounts can be charged, both under HIPAA and under state law (see here and here). What the Guidance compels me to point out, though, is that covered entities often include a provision in BAAs that requires the business associate to respond to an individual’s access request by either notifying the covered entity of the request or by providing the requested electronic or paper copy directly. The provision may require the business associate to comply with the HIPAA regulatory requirements regarding the timing of the response, either in terms of notifying the covered entity within a specified time period or by responding directly to the individual.

However, a provision stating simply that the business associate must “comply with 45 C.F.R. § 164.524 [the regulation governing individuals’ access rights]” may not be enough to ensure that the business associate limits the amount charged as per the regulation, which potentially creates unexpected noncompliance exposure for the covered entity. Thus, in light of the Guidance, covered entities should review their BAAs and consider whether updates to such provisions are required. If they don’t, they may end up dealing with an OCR enforcement action that could have been prevented with a few well-placed BAA words.

The Dutch Data Protection Authority has levied a fine of 460,000 euros on Haga Hospital for insufficient security following an investigation revealing that dozens of hospital staff had unnecessarily checked the medical records of a well-known Dutch person.

In addition, if the hospital has not improved security before October 2, 2019, it must pay 100,000 euros every two weeks, up to a maximum of 300,000 euros.

The authority’s chairman, Aleid Wolfsen, said: “The relationship between a healthcare provider and a patient should be completely confidential. Also within the walls of a hospital. It doesn’t matter who you are.”

Key takeaways:
  • Have adequate logs in place: the hospital must regularly check who consults which file.
  • Use two-factor authentication: good security requires authentication that involves at least two factors.
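The first takeaway, regularly checking who consults which file, can be automated against an access log. Here is a minimal Python sketch; the log format, user names, and care-team mapping are entirely hypothetical:

```python
# Hypothetical audit-log review: flag staff who opened a patient's record
# without being on that patient's care team (all names/fields are invented).
care_team = {"patient_123": {"dr_jansen", "nurse_visser"}}

access_log = [
    ("dr_jansen", "patient_123"),
    ("admin_bakker", "patient_123"),   # not on the care team
    ("nurse_visser", "patient_123"),
]

def unauthorized_lookups(log, teams):
    """Return (user, patient) pairs where the user is outside the care team."""
    return [(user, patient) for user, patient in log
            if user not in teams.get(patient, set())]

print(unauthorized_lookups(access_log, care_team))
# [('admin_bakker', 'patient_123')]
```

In practice a flagged lookup would trigger human review rather than automatic discipline, since legitimate access (for example, billing or an emergency consult) may fall outside a static care-team list.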

Details from the Dutch Data Protection Authority.

The California Consumer Privacy Act (CCPA) will take effect on January 1, 2020 and regulates most entities that collect personal information of California residents.  CCPA, sometimes called “GDPR-Lite,” was patterned after the European Union’s General Data Protection Regulation (GDPR), which took effect on May 25, 2018.  In May, Fox Rothschild partner Odia Kagan described when CCPA applies in an Alert that listed the categories of entities affected: generally, for-profit businesses that do business in California, collect California consumers’ personal information, determine the purposes and means of processing that information, and meet at least one of three thresholds: they have at least $25 million in annual gross revenues; they buy, sell, share and/or receive the personal information of at least 50,000 California consumers, households or devices per year; or they derive at least 50 percent of their annual revenue from selling California consumers’ personal information.  The law also covers entities that control or are controlled by such businesses and share common branding.  Each of those terms has a technical definition that should be carefully reviewed.  But isn’t there a HIPAA exception?

Yes, CCPA contains a carve-out for HIPAA covered entities, but it is not as broad as you may have heard.  In a recent alert entitled Where HIPAA Stops, CCPA Begins – Why Covered Entities and Business Associates Cannot Ignore the New California Data Privacy Law, Fox Rothschild partners Odia Kagan and Elizabeth Litten explain when information that appears to be exempt PHI may fall under the new CCPA:

Personal information created, received, maintained or transmitted by companies subject to HIPAA is likely subject to CCPA if it falls into one of the following five categories:

  1. It is not created or collected as part of the payment, treatment or health care operations trifecta
  2. It was never PHI (or is excluded from the definition of PHI) under HIPAA
  3. It was once PHI, but has been de-identified under HIPAA
  4. It is not PHI, but is derived from PHI
  5. It is PHI that is used for research purposes in accordance with HIPAA

The bottom line is that what you think is PHI and exempt from CCPA may not be covered by the carve-out after all. For details, see the Alert.

“The right to be forgotten does not apply in principle to medical records. However, as a patient, you may ask your health care provider to remove data from your medical record,” according to the Dutch Data Protection Authority, Autoriteit Persoonsgegevens (AP), which has issued guidance on GDPR and medical records.

Key takeaways:

  • For medical data that are not covered by the Medical Treatment Agreement Act, such as nursing care and in-home care, personal data should not be kept longer than necessary.
  • The personal data that you have actively and consciously provided are covered by the right to data portability. This also applies to data that you have provided indirectly through the use of a service or device, such as the data that your pacemaker or blood pressure monitor generates.
  • The right to data portability does not apply to the conclusions, diagnoses, suspicions or treatment plans that your health care provider establishes on the basis of the information you provide.
  • As a health care provider, you must in any case use two-factor authentication, such as logging in with DigiD in combination with SMS.

Read the full guidance.