It was the wallet comment in the response brief filed by the Federal Trade Commission (FTC) in the U.S. Court of Appeals for the 11th Circuit that prompted me to write this post. In its February 9, 2017 filing, the FTC argues that the likelihood of harm to individuals (patients who used LabMD’s laboratory testing services) whose information was exposed by LabMD roughly a decade ago is high because the “file was exposed to millions of users who easily could have found it – the equivalent of leaving your wallet on a crowded sidewalk.”

However, if one is to liken the LabMD file (referred to throughout the case as the “1718 File”) to a wallet and the patient information to cash or credit cards contained in that wallet, it is more accurate to describe the wallet as having been left on the kitchen counter in an unlocked New York City apartment. Millions of people could have found it, but they would have had to go looking for it, and would have had to walk through the door (or creep through a window) into a private residence to do so.

Back in January, I promised to continue my discussion of LabMD's appeal of the FTC's Final Order to the U.S. Court of Appeals for the 11th Circuit (see prior post here), planning to highlight arguments made in briefs filed by various amici curiae in support of LabMD. Amici include physicians who used LabMD's cancer testing services for their patients while LabMD was still in business, the non-profit National Federation of Independent Business, the non-profit, nonpartisan think tank TechFreedom, the U.S. Chamber of Commerce, and others. The amici make compelling legal arguments, but they also emphasize several key facts that make this case both fascinating and unsettling:

The FTC has spent millions of taxpayer dollars on this case – even though there were no victims (not one has been identified in over seven years), LabMD's data security practices were already regulated by HHS under HIPAA, and, according to the FTC's own paid litigation expert, LabMD's "unreasonableness" ceased no later than 2010. "During the litigation, … a whistleblower testified that the FTC's staff … were bound up in collusion with Tiversa [the cybersecurity firm that discovered LabMD's security vulnerability, tried to convince LabMD to purchase its remediation services, and then reported LabMD to the FTC], a prototypical shakedown racket – resulting in a Congressional investigation and a devastating report issued by House Oversight Committee staff." [Excerpt from TechFreedom's amicus brief]

An image of Tiversa taking advantage of the visible "counter-top wallet" emerges from the facts described in the November 13, 2015 Initial Decision of D. Michael Chappell, the Chief Administrative Law Judge (ALJ), a decision the FTC reversed in the summer of 2016 on the ground that the ALJ had applied the wrong legal standard for unfairness. The ALJ's "Findings of Fact" (which, notably, the FTC did not dispute in its reversal) include the following:

“121. On or about February 25, 2008, Mr. Wallace, on behalf of Tiversa, downloaded the 1718 File from a LabMD IP address …

  * The 1718 File was found by Mr. Wallace, and was downloaded from a peer-to-peer network, using a stand-alone computer running a standard peer-to-peer client, such as LimeWire…
  * Tiversa's representations in its communications with LabMD … that the 1718 File was being searched for on peer-to-peer networks, and that the 1718 File had spread across peer-to-peer networks, were not true. These assertions were the "usual sales pitch" to encourage the purchase of remediation services from Tiversa… ."

The ALJ found that although the 1718 File was available for peer-to-peer sharing via specific search terms from June of 2007 through May of 2008, the file was actually downloaded only by Tiversa, and only for the purpose of selling its security remediation services. The ALJ also found that there was no contention that Tiversa (or those with whom Tiversa shared the 1718 File, namely a Dartmouth professor working on a study and the FTC) used the contents of the file to harm patients.

In short, while LabMD may have left its security “door” unlocked when an employee downloaded LimeWire onto a work computer, only Tiversa actually walked through that door and happened upon LabMD’s wallet on the counter-top. Had the wallet been left out in the open, in a public space (such as on a crowded sidewalk), it’s far more likely its contents would have been misappropriated.

A patient requests a copy of her medical record, and the hospital charges the per-page amount permitted under state law. Does this violate HIPAA? It may.

In the spring of 2016, the Office for Civil Rights (OCR) within the U.S. Department of Health and Human Services, the agency that enforces HIPAA, issued a new guidance document on individuals' right to access their health information under HIPAA ("Access Guidance"). The Access Guidance reminds covered entities that state laws giving individuals a greater right of access (for example, requiring that access be provided within a shorter time frame than HIPAA allows, or entitling individuals to a free copy of their medical records) are not preempted and must be followed, while state laws that are contrary to HIPAA's access rights (such as laws prohibiting disclosure to an individual of certain health information, like test reports) are preempted by HIPAA.

For New Jersey physicians, for example, this means they may not automatically charge $1.00 per page or $100.00 for a copy of the entire medical record, whichever is less, despite the fact that the New Jersey Board of Medical Examiners ("BME") expressly permits these charges. In fact, according to the Access Guidance, physicians should not charge "per page" fees at all unless they maintain medical records in paper form only. New Jersey physicians also may not charge the "administrative fee" of the lesser of $10.00 or 10% of the cost of reproducing x-rays and other documents that cannot be reproduced by ordinary copying machines. Instead, a New Jersey physician may charge only the lesser of the charges permitted by the BME and those permitted under HIPAA, as described below.

HIPAA limits the amount that covered entities may charge a patient (or third party) requesting access to medical records to only a “reasonable, cost-based fee to provide the individual (or the individual’s personal representative) with a copy” of the record.  Only the following may be charged:   

(1) the reasonable cost of labor for creating and delivering the electronic or paper copy in the form and format requested or agreed upon by the individual, but not costs associated with reviewing the request, searching for or retrieving the records, and segregating or “otherwise preparing” the record for copying;  

(2) the cost of supplies for creating the paper copy (e.g., paper, toner) or electronic media (e.g., CD or USB drive) if the individual requests the records in portable electronic media; and  

(3) actual postage costs, when the individual requests mailing. 

The fee may also include the reasonable cost of labor to prepare an explanation or summary of the record, but only if the individual, in advance, chooses to receive an explanation or summary AND agrees to the fee to be charged for it.

A provider may calculate its actual labor costs each time an individual requests access, or may develop a schedule of costs for labor based on the average labor costs (of the types HIPAA permits) incurred in fulfilling standard types of access requests.  However, a provider is NOT permitted to charge an average labor cost as a per-page fee unless (1) the medical record is maintained in paper form and (2) the individual requests a paper copy or asks that the paper record be scanned into an electronic format.  Thus, under HIPAA, a per-page fee is not permitted for medical records that are maintained electronically.  As stated in the Access Guidance, "OCR does not consider per page fees for copies of … [protected health information] maintained electronically to be reasonable" for purposes of complying with the HIPAA rules.

A provider may instead decide to charge a flat fee of up to $6.50 (inclusive of labor, supplies, and any applicable postage) for requests for electronic copies of medical records maintained electronically.  OCR explains that the $6.50 is not a maximum but simply an alternative that may be used if the provider does not want to go through the process of calculating actual or average allowable costs for requests for electronic copies.

OCR has identified compliance with “individual access rights” as one of seven areas of focus in the HIPAA audits of covered entities and business associates currently underway, signaling its concern that physicians and other covered entities may be violating HIPAA in this respect.  All covered entities should, therefore, calculate what HIPAA permits them to charge when copies of medical records are requested by an individual (or someone acting at the direction of or as a personal representative of an individual), compare that amount to the applicable state law charge limits, and make sure that only the lesser of the two amounts is charged.
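By way of illustration, here is a minimal sketch (in Python) of the "charge the lesser amount" comparison described above. The cost figures are hypothetical, the function names are my own, and the New Jersey BME schedule is used as the example state-law cap; this is a simplified illustration of the reasoning, not a billing tool or legal advice.

```python
# Illustrative sketch only: hypothetical numbers and simplified rules.

def hipaa_permitted_fee(labor_cost, supply_cost, postage_cost,
                        electronic_record, electronic_copy_requested,
                        use_flat_fee_option=False):
    """Reasonable, cost-based fee (copying/delivery labor, supplies, postage),
    or the optional flat $6.50 for electronic copies of electronic records."""
    if use_flat_fee_option and electronic_record and electronic_copy_requested:
        return 6.50
    return labor_cost + supply_cost + postage_cost

def nj_bme_fee(num_pages):
    """Example state-law schedule: the lesser of $1.00 per page or $100.00."""
    return min(1.00 * num_pages, 100.00)

def permitted_charge(num_pages, labor_cost, supply_cost, postage_cost,
                     electronic_record=True, electronic_copy_requested=True,
                     use_flat_fee_option=False):
    """Charge only the lesser of the state-law amount and the HIPAA-permitted fee."""
    hipaa_fee = hipaa_permitted_fee(labor_cost, supply_cost, postage_cost,
                                    electronic_record, electronic_copy_requested,
                                    use_flat_fee_option)
    return min(nj_bme_fee(num_pages), hipaa_fee)

# Example: an 80-page record maintained electronically and emailed to the patient.
# The BME schedule would allow $80.00, but only the provider's reasonable cost
# (assumed here to be $5.00 of labor) may be charged.
print(permitted_charge(80, labor_cost=5.00, supply_cost=0.00, postage_cost=0.00))  # 5.0
```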


As she has done each January for several years, our good friend Marla Durben Hirsch quoted my partner Elizabeth Litten and me in Medical Practice Compliance Alert in her article entitled "MIPS, OSHA, other compliance trends likely to affect you in 2017." For her article, Marla asked various health law professionals to make predictions on diverse healthcare matters, including HIPAA and enforcement activities. The full text can be found in the January 2017 issue, but excerpts are included below.

Marla also wrote a companion article in the January 2017 issue evaluating the results of the predictions she published for 2016. The 2016 predictions proved to be quite accurate in most respects. However, with the new Trump Administration, we are now entering very uncertain territory in multiple aspects of healthcare regulation and enforcement. Nevertheless, with some trepidation, below are some predictions for 2017 by Elizabeth and me, taken from Marla's article.

  1. The Federal Trade Commission’s encroachment into privacy and security will come into question. Litten said, “The new administration, intent on reducing the federal government’s size and interference with businesses, may want to curb this expansion of authority and activity. Other agencies’ wings may be clipped.” Kline added, “However, the other agencies may try to push back because they have bulked up to handle this increased enforcement.”
  2. Telemedicine will run into compliance issues. As telemedicine becomes more common, more legal problems will occur. “For instance, the privacy and the security of the information stored and transmitted will be questioned,” says Litten. “There will also be heightened concern of how clinicians who engage in telemedicine are being regulated,” adds Kline.
  3. The risks relating to the Internet of things will increase. "The proliferation of cyberattacks from hacking, ransomware and denial of service schemes will not abate in 2017, especially with the increase of devices that access the Internet, known as the 'Internet of things,'" warns Kline. "More devices than ever will be networked, but providers may not protect them as well as they do other electronics and may not even realize that some of them — such as newer HVAC systems, 'smart' televisions or security cameras that can be controlled remotely — are also on the Internet and thus vulnerable," adds Litten. "Those more vulnerable items will then be used to infiltrate providers' other systems," Kline observes.
  4. More free enterprise may create opportunities for providers. “For example, there may not be as much of a commitment to examine mergers,” says Kline. “The government may allow more gathering and selling of data in favor of business interests over privacy and security concerns,” says Litten.

The ambitious and multi-faceted foray by the Trump Administration into the world of healthcare among its many initiatives will make 2017 an interesting and controversial year. Predictions are always uncertain, but 2017 brings new and daunting risks to the prognosticators.  Nonetheless, when we look back at 2017, perhaps we may be saying, “The more things change, the more they stay the same.”

It was nearly three years ago that I first blogged about the Federal Trade Commission’s “Wild West” data breach enforcement action brought against now-defunct medical testing company LabMD.   Back then, I was simply astounded that a federal agency (the FTC) with seemingly broad and vague standards pertaining generally to “unfair” practices of a business entity would belligerently gallop onto the scene and allege non-compliance by a company specifically subject by statute to regulation by another federal agency. The other agency, the U.S. Department of Health and Human Services (HHS), has adopted comprehensive regulations containing extremely detailed standards pertaining to data security practices of certain persons and entities holding certain types of data.

The FTC Act governs business practices, in general, and has no implementing regulations, whereas HIPAA specifically governs Covered Entities and Business Associates and their Uses and Disclosures of Protected Health Information (or “PHI”) (capitalized terms that are all specifically defined by regulation). The HIPAA rulemaking process has resulted in hundreds of pages of agency interpretation published within the last 10-15 years, and HHS continuously posts guidance documents and compliance tools on its website. Perhaps I was naively submerged in my health care world, but I had no idea back then that a Covered Entity or Business Associate could have HIPAA-compliant data security practices that could be found to violate the FTC Act and result in a legal battle that would last the better part of a decade.

I’ve spent decades analyzing regulations that specifically pertain to the health care industry, so the realization that the FTC was throwing its regulation-less lasso around the necks of unsuspecting health care companies was both unsettling and disorienting. As I followed the developments in the FTC’s case against LabMD over the past few years (see additional blogs here, here, here and here), I felt like I was moving from the Wild West into Westworld, as the FTC’s arguments (and facts coming to light during the administrative hearings) became more and more surreal.

Finally, though, reality and reason have arrived on the scene as the LabMD saga plays out in the U.S. Court of Appeals for the 11th Circuit. The 11th Circuit issued a temporary stay of the FTC's Final Order against LabMD (an order that reversed the highly unusual decision against the FTC by the Administrative Law Judge who presided over the administrative action).

The Court summarized the facts as developed in the voluminous record, portraying LabMD as having simply held its ground against the appalling, extortion-like tactics of the company that infiltrated LabMD's data system. It was that company, Tiversa, that convinced the FTC to pursue LabMD in the first place. According to the Court, Tiversa's CEO told one of its employees to make sure LabMD was "at the top of the list" of company names turned over to the FTC in the hopes that FTC investigations would pressure the companies into buying Tiversa's services. As explained by the Court:

In 2008, Tiversa … a data security company, notified LabMD that it had a copy of the [allegedly breached data] file. Tiversa employed forensic analysts to search peer-to-peer networks specifically for files that were likely to contain sensitive personal information in an effort to “monetize” those files through targeted sales of Tiversa’s data security services to companies it was able to infiltrate. Tiversa tried to get LabMD’s business this way. Tiversa repeatedly asked LabMD to buy its breach detection services, and falsely claimed that copies of the 1718 file were being searched for and downloaded on peer-to-peer networks.”

As if the facts behind the FTC’s action weren’t shocking enough, the FTC’s Final Order imposed bizarrely stringent and comprehensive data security measures against LabMD, a now-defunct company, even though its only remaining data resides on an unplugged, disconnected computer stored in a locked room.

The Court, though, stayed the Final Order, finding that even though the FTC's interpretation of the FTC Act is entitled to deference:

LabMD … made a strong showing that the FTC’s factual findings and legal interpretations may not be reasonable… [unlike the FTC,] we do not read the word “likely” to include something that has a low likelihood. We do not believe an interpretation [like the FTC’s] that does this is reasonable.”

I was still happily reveling in the refreshingly simple logic of the Court's words when I read the brief filed in the 11th Circuit by LabMD counsel Douglas Meal and Michelle Visser of Ropes & Gray LLP. Here, finally, was the legal rationale for, and a clear articulation of, the unease I felt nearly three years ago: Congress (through HIPAA) granted HHS the authority to regulate the data security practices of medical companies like LabMD that use and disclose PHI, and the FTC's assertion of authority over such companies is "repugnant" to Congress's grant to HHS.

A continuation of the discussion of the 11th Circuit case and the filings by amici curiae in support of LabMD will be posted as Part 2.

U.S. Representative Tim Murphy (R-PA) has been a vocal advocate for mental health reform for a number of years.  Part of his crusade is driven by his concern that the HIPAA privacy rule “routinely interferes with the timely and continuous flow of health information between health care providers, patients, and families, thereby impeding patient care, and in some cases, public safety.”  Congressman Murphy’s efforts have resulted in the inclusion in the recently-passed 21st Century Cures Act of a provision entitled “Compassionate Communications on HIPAA” targeted at improving understanding of what mental health information can be shared with family members and caregivers.

The 21st Century Cures Act streamlines the drug approval process, authorizes $4.8 billion in new health research funding, including $1.8 billion for Vice President Joe Biden’s “cancer moonshot” and $1.6 billion for brain diseases such as Alzheimer’s, and provides grants to combat the opioid epidemic.

Of most interest to readers of this blog, the Act also calls for the Department of Health and Human Services (HHS) to clarify the situations in which HIPAA permits health care professionals to communicate with caregivers of adults with a serious mental illness to facilitate treatment.  By December 13, 2017, the Secretary of HHS is required to issue guidance  regarding when such disclosures would require the patient’s consent; when the patient must be given an opportunity to object; when disclosures may be made based on the exercise of professional judgment regarding whether the patient would object when consent may not be obtained due to incapacity or emergency; and when disclosures may be made in the best interest of the patient when the patient is not present or is incapacitated.   HHS is directed to address communications to family members or other individuals involved in the care of the patient, including facilitating treatment and medication adherence.  Guidance is also required regarding communications when a patient presents a serious and imminent threat of harm to self or others.  HHS is directed to develop model training materials for healthcare providers, patients and their families.

The law incorporates the Substance Abuse and Mental Health Services Administration's definition of the term "serious mental illness" as "a diagnosable mental, behavioral, or emotional disorder that results in serious functional impairment and substantially interferes with or limits one or more major life activities."

Importantly, the law neither changes existing regulatory exceptions under HIPAA nor directs HHS to modify them.  Instead, it calls for further explanation of existing rules that are often poorly understood by providers, patients and caregivers alike, or that may be used inappropriately to thwart the flow of meaningful and helpful information, creating barriers to effective communication that would benefit patients and improve mental health outcomes.

An existing public safety exception permits a covered entity to use or disclose PHI if the covered entity, in good faith, believes the use or disclosure is necessary to prevent or lessen a serious and imminent threat to the health or safety of a person or the public; and the disclosure is made to a person or persons reasonably able to prevent or lessen the threat, including the target of the threat.

The existing exception for caregivers permits disclosures to a family member, other relatives, or a close personal friend of the individual, or any other person identified by the individual, but only regarding PHI that is directly relevant to such person’s involvement with the individual’s health care or payment for care.

PHI may also be disclosed when the patient is present and provides consent, does not object to a disclosure of PHI to another individual accompanying them when given the opportunity to object, or where the covered entity reasonably infers from the circumstances, based on the exercise of professional judgment, that the patient does not object to the disclosure.

Other existing exceptions address emergency situations as well as cases where the patient is incapacitated, and permit disclosure of only the PHI that is directly relevant to the other person’s involvement with the patient’s care or payment.
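Because these overlapping exceptions can be hard to keep straight, here is a deliberately simplified sketch (in Python, with parameter names of my own choosing) of the decision points described in the preceding paragraphs. It illustrates the structure of the rules only; the regulations and the provider's professional judgment, not this outline, control.

```python
# Simplified illustration of the existing HIPAA caregiver/safety exceptions
# summarized above. Parameter names are hypothetical paraphrases, and the
# inputs are assumed to be internally consistent.

def may_disclose_to_family_or_caregiver(
    directly_relevant_to_involvement: bool,   # PHI relevant to the person's role in care/payment
    patient_present_with_capacity: bool,
    patient_agrees_or_does_not_object: bool,  # consent, or no objection when given the chance
    inferred_no_objection: bool,              # professional judgment from the circumstances
    incapacitated_or_emergency: bool,
    disclosure_in_patients_best_interest: bool,
    serious_imminent_threat: bool,
    recipient_can_lessen_threat: bool,
) -> bool:
    # Public safety exception: serious and imminent threat, disclosed to
    # someone reasonably able to prevent or lessen it (including the target).
    if serious_imminent_threat and recipient_can_lessen_threat:
        return True

    # Caregiver disclosures are limited to PHI directly relevant to that
    # person's involvement in the patient's care or payment for care.
    if not directly_relevant_to_involvement:
        return False

    if patient_present_with_capacity:
        return patient_agrees_or_does_not_object or inferred_no_objection

    # Patient absent, incapacitated, or in an emergency: disclosure based on
    # professional judgment that it is in the patient's best interest.
    return incapacitated_or_emergency and disclosure_in_patients_best_interest
```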

The new law falls short of Rep. Murphy's previous legislative proposals.  In 2015, Murphy introduced a bill entitled the Helping Families In Mental Health Crisis Act, which he said would "allow the doctor or mental health professional to provide the diagnosis, treatment plans, appointment scheduling, and prescription information to the family member and known caregiver for a patient with a serious mental illness. This change would apply for those who can benefit from care yet are unable to follow through on their own self-directed care."  That bill passed the House by a wide margin but was not enacted.

While the new law does not expand HIPAA exceptions, it does make it more likely that those exceptions already on the books will be more clearly understood and implemented in cases involving serious mental illness.

It may not come as a surprise that Congressman Tom Price, MD (R-GA), a vocal critic of the Affordable Care Act who introduced legislation to replace it last spring, was selected to serve as Secretary of the U.S. Department of Health and Human Services (HHS) in the Trump administration. What may come as a bit of a surprise is how Price’s proposed replacement bill appears to favor transparency over individual privacy when it comes to certain health care claim information.

Section 601 of the "Empowering Patients First" bill (Bill) would require a health insurance issuer to send a report including specific claim information to a health plan, plan sponsor or plan administrator upon request (Report). The Bill would require the Report to include "all information available to the health insurance issuer that is responsive to the request including … protected health information [PHI] … ."

Since a “plan sponsor” includes an employer (in the case of an employee benefit plan established or maintained by the employer), the Bill would entitle an employer to receive certain PHI of employees and employees’ dependents, as long as the employer first certifies to the health insurance issuer that its plan documents comply with HIPAA and that the employer, as plan sponsor, will safeguard the PHI and limit its use and disclosure to plan administrative functions.

The Report would include claim information that would not necessarily be PHI (such as aggregate paid claims experience by month and the total amount of claims pending as of the date of the report), but could also include:

“A separate description and individual claims report for any individual whose total paid claims exceed $15,000 during the 12-month period preceding the date of the report, including the following information related to the claims for that individual –

(i) a unique identifying number, characteristic or code for the individual;

(ii) the amounts paid;

(iii) the dates of service; and

(iv) applicable procedure and diagnosis codes.”

After reviewing the Report and within 10 days of its receipt, the plan, plan sponsor, or plan administrator would be permitted to make a written request for additional information concerning these individuals. If requested, the health insurance issuer must provide additional information on “the prognosis or recovery if available and, for individuals in active case management, the most recent case management information, including any future expected costs and treatment plan, that relate to the claims for that individual.”
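For readers who prefer to see the Report's requirements laid out structurally, here is a minimal sketch (in Python, with hypothetical field names that I have chosen for illustration) of the data elements described above. The text of Section 601, not this outline, controls.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List

# Hypothetical field names; a structural paraphrase of the Bill's Report.

@dataclass
class IndividualClaimsReport:
    """Per-individual detail required when paid claims exceed $15,000
    in the 12 months preceding the report date."""
    unique_identifier: str        # identifying number, characteristic or code
    amounts_paid: List[float]
    dates_of_service: List[date]
    procedure_codes: List[str]
    diagnosis_codes: List[str]

@dataclass
class ClaimInformationReport:
    """Sketch of the Section 601 'Report' an issuer would provide to a plan,
    plan sponsor, or plan administrator on request."""
    report_date: date
    aggregate_paid_claims_by_month: Dict[str, float]   # e.g., {"2017-01": 125000.0}
    total_pending_claims: float                        # as of the report date
    high_cost_individuals: List[IndividualClaimsReport] = field(default_factory=list)

HIGH_COST_THRESHOLD = 15_000.00  # 12-month paid-claims trigger in the Bill

def requires_individual_detail(total_paid_last_12_months: float) -> bool:
    """Individual-level detail is required only above the $15,000 threshold."""
    return total_paid_last_12_months > HIGH_COST_THRESHOLD
```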

Price transparency has been studied as a potentially effective way to lower health care costs, and employers are often in a difficult position when it comes to understanding what they pay, as plan sponsors, to provide health insurance coverage to employees and their families.   Laws and tools that increase the transparency of health care costs are desperately needed, and the Empowering Patients First bill valiantly attempts to create a mechanism whereby plan sponsors can identify and plan for certain health care costs. On the other hand, in requiring the disclosure of procedure and diagnosis codes to employers, and in permitting employers to obtain follow-up “case management” information, the bill seems to miss the HIPAA concept of “minimum necessary”. Even if an employer certifies that any PHI it receives will be used only for plan administration functions, employees might be concerned that details regarding their medical condition and treatments might affect employment decisions unfairly and in ways prohibited by HIPAA.

If Dr. Price steps up to lead HHS in the coming Trump administration, let’s hope he takes another look at this Section from the perspective of HHS as the enforcer of HIPAA privacy protections.

Federal enforcement agencies are increasingly focusing on HIPAA breaches which involve mishandling of PHI by telecommuters.  Two recent cases illustrate the liability exposure resulting from inadequate oversight of staff working remotely.

Medical equipment supplier Lincare was fined $239,800 as a result of a breach which occurred when an employee left unprotected PHI in a car in the possession of her estranged husband.  An Administrative Law Judge upheld the penalty, noting that Lincare did not have policies in place requiring employees to safeguard medical information off-site.

In a second case, Cancer Care Group, an Indianapolis radiation oncology practice (CCG), entered into a $750,000 settlement with OCR after unencrypted backup tapes containing the PHI of more than 50,000 patients were stolen from a telecommuting employee’s vehicle.  OCR required the group to enter into a Corrective Action Plan that included conducting a risk analysis and developing and implementing policies and procedures to prevent similar occurrences.

My partners Michael Kline and Elizabeth Litten were quoted in the November issue of Medical Practice Compliance Alert by Marla Durben Hirsch in her article entitled “Call it telecommuting or working remotely, it needs a HIPAA policy.”

It is increasingly common for employers, including health care providers, to allow staff to work off site on a full- or part-time basis. While it’s most commonly seen as working from home, it includes anywhere but the office, including on a train, in a coffee shop, while traveling from patient to patient or elsewhere, points out attorney Michael Kline with Fox Rothschild in Princeton, N.J.

But it increases the risk of HIPAA violations because the practice is no longer in control of some of the technical and physical safeguards required by HIPAA’s security rule to protect the PHI, points out attorney Elizabeth Litten, also with Fox Rothschild.

“There are more opportunities for things to go wrong,” Litten warns.

Among the tips suggested in the article are the following:

  1. Have clear policies about what practices are accepted and how workers will protect the data;
  2. Determine what hardware and software will be allowed and how it must be configured;
  3. Make sure that the PHI can be password-protected, encrypted or otherwise segregated if the employee does not have a dedicated computer, so that family members who have access to the computer can’t view the PHI. “You don’t want it accessed by little children who want to look at Bubble Guppies,” says Kline.
  4. Double check that your insurance policies allow telecommuting;
  5. Include PHI off the premises as part of your practice’s overall risk assessments and management;
  6. Incorporate protection of PHI into your practice’s telecommuting policy;
  7. Get the promise to protect PHI in writing; and
  8. Monitor how telecommuters handle PHI.

Failure to design and implement effective telecommuting policies and procedures contributed to the breaches at Lincare and CCG and may have substantially increased the magnitude of the financial penalties.  Ideally, covered entities and business associates should anticipate issues with telecommuters and roll out appropriate rules before any PHI leaves the office, but if you already have team members working remotely, it is better to address these risks late than never.


According to the latest HIPAA-related guidance (Guidance) published by the U.S. Department of Health and Human Services (HHS), a cloud service provider (CSP) maintaining a client’s protected health information (PHI) is a business associate even when the CSP can’t access or view the PHI. In other words, even where the PHI is encrypted and the CSP lacks the decryption key, the CSP is a business associate because it maintains the PHI and, therefore, has HIPAA-related obligations with respect to the PHI.

HHS explains:

While encryption protects ePHI by significantly reducing the risk of the information being viewed by unauthorized persons, such protections alone cannot adequately safeguard the confidentiality, integrity and availability of the ePHI, such as ensuring that the information is not corrupted by malware, or ensuring through contingency planning that the data remains available to authorized persons even during emergency or disaster situations. Further, encryption does not address other safeguards that are also important to maintaining confidentiality, such as administrative safeguards to analyze the risks to the ePHI or physical safeguards for systems and services that may house the ePHI.”

It makes sense to treat a CSP as a business associate if it holds PHI, even if it cannot view or access that PHI. After all, a business associate is a person or entity that performs a function or service on behalf of a covered entity (or another business associate) that requires it to create, receive, maintain, or transmit PHI.

Still, HHS’s explanation is less than satisfying, perhaps because it rather crudely mixes together very distinct HIPAA obligations:  protecting the confidentiality of PHI, on one hand, and protecting the integrity and availability of PHI, on the other.

Under the HIPAA regulations, a business associate is only required to provide notice to the covered entity following the discovery of a breach of unsecured PHI. “Unsecured” PHI is defined as PHI that is “not rendered unusable, unreadable, or indecipherable to unauthorized persons through the use of a technology or methodology specified by the Secretary [of HHS]…” – in other words, PHI that is not encrypted at a level that meets HHS’s standards. The HIPAA regulations also say that a breach excludes a “disclosure of PHI where a covered entity or business associate has a good faith belief that an unauthorized person to whom the disclosure was made would not reasonably have been able to retain such information.” Obviously, a disclosure of PHI that cannot be viewed will also not be able to be retained.

HHS contends that encryption "alone cannot adequately safeguard the confidentiality" of the PHI, but, later in the Guidance, concedes that if the PHI is encrypted at a level that meets HHS's standards, an incident involving unauthorized access or disclosure would fall within the breach "safe harbor" and would not need to be reported to the CSP's customer. In such a case, the confidentiality of the PHI would be adequately safeguarded by encryption alone, and the CSP arguably would not have an obligation to do anything else under HIPAA to protect the confidentiality of the PHI.  The CSP would have ongoing obligations, however, to protect the integrity and availability of the encrypted PHI under HIPAA. In that sense, the encryption "blindfold" simplifies the CSP's obligations under HIPAA.
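To make the distinction concrete, here is a simplified sketch (in Python; the function and parameter names are my own, and the analysis is deliberately reduced) of the breach-notification reasoning described above: if the PHI was encrypted at a level meeting HHS's standards, the "secured PHI" safe harbor applies and the incident is not a reportable breach, while the CSP's duties to protect integrity and availability continue either way.

```python
# Simplified illustration of the "unsecured PHI" breach-notification logic
# discussed above. It intentionally ignores other exclusions, such as the
# "unable to retain the information" exception.

def phi_is_secured(encrypted: bool, meets_hhs_encryption_standard: bool) -> bool:
    """PHI is 'secured' only if rendered unusable, unreadable, or indecipherable
    via an HHS-specified methodology (e.g., conforming encryption)."""
    return encrypted and meets_hhs_encryption_standard

def must_notify_customer(impermissible_disclosure: bool,
                         encrypted: bool,
                         meets_hhs_encryption_standard: bool) -> bool:
    """A business associate CSP must notify its covered-entity customer only
    of a breach of *unsecured* PHI; properly encrypted PHI falls in the safe harbor."""
    if not impermissible_disclosure:
        return False
    return not phi_is_secured(encrypted, meets_hhs_encryption_standard)

# Integrity and availability obligations (contingency planning, protection
# against corruption, etc.) continue to apply to the CSP in either case.
print(must_notify_customer(True, True, True))   # False: encryption safe harbor
print(must_notify_customer(True, True, False))  # True: encryption below HHS standard
```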

A CSP is in a tricky position if it holds encrypted PHI for a customer but does not know that it holds it. The Guidance emphasizes that if a CSP maintains PHI for a customer that is a covered entity or business associate, it must execute a business associate agreement with the customer, and it risks enforcement action (such as reported here) by the Office for Civil Rights (OCR) within HHS if it doesn't have one.

“OCR recognizes that there may, however, be circumstances where a CSP may not have actual or constructive knowledge that a covered entity or another business associate is using its services to create, receive, maintain, or transmit ePHI.  The HIPAA Rules provide an affirmative defense in cases where a CSP takes action to correct any non-compliance within 30 days … of the time that it knew or should have known of the violation… This affirmative defense does not, however, apply in cases where the CSP was not aware of the violation due to its own willful neglect.”

Two key takeaways from the Guidance for a CSP? If you are blindfolded from viewing the data you maintain or transmit on behalf of a customer, or otherwise do not know whether the data might bring HIPAA obligations along with it, take reasonable steps to find out if the customer is a covered entity or business associate and whether the data includes PHI.  If so, execute a business associate agreement. Then, make sure the blindfold (i.e., encryption level) meets HHS’s standards and do NOT accept or have access to the decryption key.  This way, you can focus your HIPAA compliance efforts on protecting the integrity and accessibility of the data, not on protecting its confidentiality.

Last week, I blogged about a recent U.S. Department of Health and Human Services Office for Civil Rights (OCR) announcement on its push to investigate smaller breaches (those involving fewer than 500 individuals).  The week before that, my partner and fellow blogger Michael Kline wrote about OCR's guidance on responding to cybersecurity incidents.  Today, TechRepublic Staff Writer Alison DeNisco addresses how a small or medium-sized business (SMB) can deal with the heightened threat of OCR investigations or lawsuits emanating from a security breach.  Alison's piece, "Security breaches: How small businesses can avoid a HIPAA lawsuit", is a must-read for SMBs struggling to understand and prioritize their cybersecurity needs.

Michael and I spoke with Alison about the recent OCR pronouncements, and she pulled several of our comments together to create a list of tips for an SMB to consider to minimize HIPAA security breach headaches. The following 6 tips are excerpted from the full article:

  1. Hire a credible consultant to help you approach the issue and plan how you would respond in the event of a breach. [In other words, perform your own security risk assessment or, if that is impractical, hire an expert to perform one.]
  2. Document that you have policies and procedures in place to fight cyber crime. “If you didn’t document it, it didn’t happen,” Kline said.
  3. Stay informed of cybersecurity news in your industry, or join an association. Be aware of what other companies in your space are doing to protect themselves.
  4. Update your security settings on a regular basis, perhaps every time you add new employees or change systems, or on an annual basis.
  5. Present annually to your company board on where the company is in terms of cybersecurity protection, and where it needs to be to remain as safe as possible in the future.
  6. If you’re an IT consultant working with a healthcare organization, be clear with your client what you need to access and when, Litten said. “A client that has protected health information in its software should carefully delineate who has access to that software,” she added.

The article also quotes Ebba Blitz, CEO of Alertsec, who offers an equally important tip for the SMB dealing with employees’ use of mobile devices that contain or are used to transmit PHI:

"You need a good plan for mitigating BYOD," Blitz said. She further recommends asking employees to document their devices, so businesses can keep track of them and install security tools.

In summary, confronting the ever-growing and evolving cybersecurity challenges facing SMBs depends on serious planning, the development and implementation of up-to-date policies and procedures, documentation of the cybersecurity measures taken, and an entity-wide commitment to the effort.

What you might have thought was not a big breach (or a big deal in terms of HIPAA compliance) might end up being a big headache for covered entities and business associates. In fact, it's probably a good idea to try to find out what "smaller" breaches your competitors are reporting (admittedly not an easy task, since the "Wall of Shame" only details breaches affecting the protected health information (PHI) of 500 or more individuals).

Subscribers to the U.S. Department of Health and Human Services Office for Civil Rights (OCR) listserv received an announcement a couple of weeks ago that OCR would begin to "More Widely Investigate Breaches Affecting Fewer than 500 Individuals". The announcement states that the OCR Regional Offices investigate all reported breaches involving the PHI of 500 or more individuals and, "as resources permit", investigate breaches involving fewer than 500.  The announcement then warns that Regional Offices will increase efforts "to identify and obtain corrective action to address entity and systemic noncompliance" related to these "under-500" breaches.

Regional Offices will still focus these investigations on the size of the breach (so perhaps an isolated breach affecting only one or two individuals will not raise red flags), but now they will also focus on small breaches that involve the following factors:

* Theft or improper disposal of unencrypted PHI;

* Breaches that involve unwanted intrusions to IT systems (for example, by hacking);

* The amount, nature and sensitivity of the PHI involved; and

* Instances where numerous breach reports from a particular covered entity or business associate raise similar concerns.

If any of these factors are involved in the breach, the reporting entity should not assume that, because the PHI of fewer than 500 individuals was compromised in a single incident, OCR is not going to pay attention. Instead, whenever any of these factors relate to the breach being reported, the covered entity (or business associate involved with the breach) should double or triple its efforts to understand how the breach occurred and to prevent its recurrence.  In other words, don’t wait for the OCR to contact you – promptly take action to address the incident and to try to prevent it from happening again.

So if an employee’s smart phone is stolen and it includes the PHI of a handful of individuals, that’s one thing. But if you don’t have or quickly adopt a mobile device policy following the incident and, worse yet, another employee’s smart phone or laptop is lost or stolen (and contains unencrypted PHI, even if it only contains that of a small handful of individuals), you may be more likely to be prioritized for investigation and face potential monetary penalties, in addition to costly reporting and compliance requirements.

This list of factors really should come as no surprise to covered entities and business associates, given the links included in the announcement to recent, well-publicized OCR settlements of cases involving smaller breaches.  But OCR’s comment near the very end of the announcement, seemingly made almost in passing, is enough to send chills down the spines of HIPAA compliance officers, if not induce full-blown headaches:

Regions may also consider the lack of breach reports affecting fewer than 500 individuals when comparing a specific covered entity or business associate to like-situated covered entities and business associates.”

In other words, if the hospital across town is regularly reporting hacking incidents involving fewer than 500 individuals, but your hospital only reported one or two such incidents in the past reporting period, your "small breach" may be the next Regional Office target for investigation. It will be the covered entity's (or business associate's) problem to figure out what its competitors and colleagues are reporting to OCR by way of the "fewer than 500" notice link.