Individuals who have received notice of a HIPAA breach are often offered free credit monitoring services for some period of time, particularly if the protected health information involved included social security numbers.  I have not (yet) received such a notice, but was concerned when I learned about the massive Equifax breach (see here to view a post on this topic on our Privacy Compliance and Data Security blog).

The Federal Trade Commission’s Consumer Information page sums it up well:

If you have a credit report, there’s a good chance that you’re one of the 143 million American consumers whose sensitive personal information was exposed in a data breach at Equifax… .”

I read the news reports this morning, and decided to go on the Equifax site, equifaxsecurity2017.com, to see if my information may have been affected and to sign up for credit file monitoring and identity theft protection (the services are free to U.S. consumers, whether or not affected by the breach, for one year).

The Equifax site describes the breach and lets users click on a “Potential Impact” tab to find out whether their information “may have been impacted” by the breach. Users can find out by clicking on the “Check Potential Impact” link and following these steps:

  1. Click on the below link, “Check Potential Impact,” and provide your last name and the last six digits of your Social Security number.
  2. Based on that information, you will receive a message indicating whether your personal information may have been impacted by this incident.
  3. Regardless of whether your information may have been impacted, we will provide you the option to enroll in TrustedID Premier. You will receive an enrollment date. You should return to this site and follow the “How do I enroll?” instructions below on or after that date to continue the enrollment and activation process. The enrollment period ends on Tuesday, November 21, 2017.

Before satisfying my curiosity, though, I decided to click on the “Terms of Use”, that too-rarely-used link typically included at the bottom of a webpage that sets forth the quid pro quo of using a website. Perhaps it was because my law partner (and the firm’s Chief Privacy Officer), Mark McCreary, has instilled some cautiousness in me, or because I wondered if there might be a catch. Why would Equifax offer a free year of credit monitoring to everyone, even those not affected by the breach? What would Equifax get in return?

I skimmed the “Product Agreement and Terms of Use”, noted the bolded text requiring arbitration of disputes and waiving my right to participate in a class action, but wasn’t concerned enough to resist the urge to find out if my information was affected.

I then followed the “Getting Started” process by following the TrustedID Premier link, and quickly received a notice stating that my information “may have been impacted” and that I could enroll on September 11, 2017 (my “designated enrollment date”).

Not more than a couple of hours later, I came across an article warning of the legal rights consumers give up by signing up on Equifax’s website. The article describes the arbitration clause in the Terms of Use provisions, and reports on New York Attorney General Eric Schneiderman’s tweet stating that the arbitration provision is “unacceptable and unenforceable”. The article also reports that, today, Equifax updated the Terms of Use language to include a new provision allowing a user to write to Equifax to opt out of the arbitration provision within 30 days of the date the user first accepts the Product Agreement and Terms of Use.

My curiosity got the best of me and I now know I’m on the “affected persons” list, but I haven’t yet signed up for my free TrustedID Premier credit monitoring service. I have the weekend to decide whether to sign up for the service, and 30 days from Monday (if I actually sign up for the service) to decide whether to accept the “cost” of agreeing to binding arbitration.

It was the wallet comment in the response brief filed by the Federal Trade Commission (FTC) in the U.S. Court of Appeals for the 11th Circuit that prompted me to write this post. In its February 9, 2017 filing, the FTC argues that the likelihood of harm to individuals (patients who used LabMD’s laboratory testing services) whose information was exposed by LabMD roughly a decade ago is high because the “file was exposed to millions of users who easily could have found it – the equivalent of leaving your wallet on a crowded sidewalk.”

However, if one is to liken the LabMD file (referred to throughout the case as the “1718 File”) to a wallet and the patient information to cash or credit cards contained in that wallet, it is more accurate to describe the wallet as having been left on the kitchen counter in an unlocked New York City apartment. Millions of people could have found it, but they would have had to go looking for it, and would have had to walk through the door (or creep through a window) into a private residence to do so.

Back in January, I promised to continue my discussion of LabMD’s appeal of the FTC’s Final Order in the U.S. Court of Appeals for the 11th Circuit (see prior post here), planning to highlight arguments expressed in briefs filed by various amici curiae in support of LabMD. Amici include physicians who used LabMD’s cancer testing services for their patients while LabMD was still in business, the non-profit National Federation of Independent Business, the non-profit, nonpartisan think tank TechFreedom, the U.S. Chamber of Commerce, and others. Amici make compelling legal arguments, but also emphasize several key facts that make this case both fascinating and unsettling:

The FTC has spent millions of taxpayer dollars on this case – even though there were no victims (not one has been identified in over seven years), LabMD’s data security practices were already regulated by HHS under HIPAA, and, according to the FTC’s paid litigation expert, LabMD’s “unreasonableness” ceased no later than 2010. During the litigation, “a whistleblower testified that the FTC’s staff … were bound up in collusion with Tiversa [the cybersecurity firm that discovered LabMD’s security vulnerability, tried to convince LabMD to purchase its remediation services, then reported LabMD to the FTC], a prototypical shakedown racket – resulting in a Congressional investigation and a devastating report issued by House Oversight Committee staff.” [Excerpt from TechFreedom’s amicus brief]

An image of Tiversa as taking advantage of the visible “counter-top wallet” emerges when reading the facts described in the November 13, 2015 Initial Decision of D. Michael Chappell, the Chief Administrative Law Judge (ALJ), a decision that would be reversed by the FTC in the summer of 2016 when it concluded that the ALJ applied the wrong legal standard for unfairness. The ALJ’s “Findings of Fact” (which are not disputed by the FTC in the reversal, notably) include the following:

“121. On or about February 25, 2008, Mr. Wallace, on behalf of Tiversa, downloaded the 1718 File from a LabMD IP address …

  1. The 1718 File was found by Mr. Wallace, and was downloaded from a peer-to-peer network, using a stand-alone computer running a standard peer-to-peer client, such as LimeWire…
  2. Tiversa’s representations in its communications with LabMD … that the 1718 File was being searched for on peer-to-peer networks, and that the 1718 File had spread across peer-to-peer networks, were not true. These assertions were the “usual sales pitch” to encourage the purchase of remediation services from Tiversa… .”

The ALJ found that although the 1718 File was available for peer-to-peer sharing via use of specific search terms from June of 2007 through May of 2008, the 1718 File was actually only downloaded by Tiversa for the purpose of selling its security remediation services. The ALJ also found that there was no contention that Tiversa (or those Tiversa shared the 1718 File with, namely, a Dartmouth professor working on a study and the FTC) used the contents of the file to harm patients.

In short, while LabMD may have left its security “door” unlocked when an employee downloaded LimeWire onto a work computer, only Tiversa actually walked through that door and happened upon LabMD’s wallet on the counter-top. Had the wallet been left out in the open, in a public space (such as on a crowded sidewalk), it’s far more likely its contents would have been misappropriated.

As she has done each January for several years, our good friend Marla Durben Hirsch quoted my partner Elizabeth Litten and me in Medical Practice Compliance Alert in her article entitled “MIPS, OSHA, other compliance trends likely to affect you in 2017.” For her article, Marla asked various health law professionals to make predictions on diverse healthcare matters, including HIPAA and enforcement activities. Full text can be found in the January 2017 issue, but excerpts are included below.

Marla also wrote a companion article in the January 2017 issue evaluating the results of the predictions she published for 2016. The 2016 predictions proved to be quite accurate in most respects. However, with the new Trump Administration, we are now entering very uncertain territory in multiple aspects of healthcare regulation and enforcement. Nevertheless, with some trepidation, below are some predictions for 2017 by Elizabeth and me, taken from Marla’s article.

  1. The Federal Trade Commission’s encroachment into privacy and security will come into question. Litten said, “The new administration, intent on reducing the federal government’s size and interference with businesses, may want to curb this expansion of authority and activity. Other agencies’ wings may be clipped.” Kline added, “However, the other agencies may try to push back because they have bulked up to handle this increased enforcement.”
  2. Telemedicine will run into compliance issues. As telemedicine becomes more common, more legal problems will occur. “For instance, the privacy and the security of the information stored and transmitted will be questioned,” says Litten. “There will also be heightened concern of how clinicians who engage in telemedicine are being regulated,” adds Kline.
  3. The risks relating to the Internet of things will increase. “The proliferation of cyberattacks from hacking, ransomware and denial of service schemes will not abate in 2017, especially with the increase of devices that access the Internet, known as the ‘Internet of things,’” warns Kline. “More devices than ever will be networked, but providers may not protect them as well as they do other electronics and may not even realize that some of them — such as newer HVAC systems, ‘smart’ televisions or security cameras that can be controlled remotely — are also on the Internet and thus vulnerable,” adds Litten. “Those more vulnerable items will then be used to infiltrate providers’ other systems,” Kline observes.
  4. More free enterprise may create opportunities for providers. “For example, there may not be as much of a commitment to examine mergers,” says Kline. “The government may allow more gathering and selling of data in favor of business interests over privacy and security concerns,” says Litten.

The ambitious and multi-faceted foray by the Trump Administration into the world of healthcare among its many initiatives will make 2017 an interesting and controversial year. Predictions are always uncertain, but 2017 brings new and daunting risks to the prognosticators.  Nonetheless, when we look back at 2017, perhaps we may be saying, “The more things change, the more they stay the same.”

It was nearly three years ago that I first blogged about the Federal Trade Commission’s “Wild West” data breach enforcement action brought against now-defunct medical testing company LabMD.   Back then, I was simply astounded that a federal agency (the FTC) with seemingly broad and vague standards pertaining generally to “unfair” practices of a business entity would belligerently gallop onto the scene and allege non-compliance by a company specifically subject by statute to regulation by another federal agency. The other agency, the U.S. Department of Health and Human Services (HHS), has adopted comprehensive regulations containing extremely detailed standards pertaining to data security practices of certain persons and entities holding certain types of data.

The FTC Act governs business practices, in general, and has no implementing regulations, whereas HIPAA specifically governs Covered Entities and Business Associates and their Uses and Disclosures of Protected Health Information (or “PHI”) (capitalized terms that are all specifically defined by regulation). The HIPAA rulemaking process has resulted in hundreds of pages of agency interpretation published within the last 10-15 years, and HHS continuously posts guidance documents and compliance tools on its website. Perhaps I was naively submerged in my health care world, but I had no idea back then that a Covered Entity or Business Associate could have HIPAA-compliant data security practices that could be found to violate the FTC Act and result in a legal battle that would last the better part of a decade.

I’ve spent decades analyzing regulations that specifically pertain to the health care industry, so the realization that the FTC was throwing its regulation-less lasso around the necks of unsuspecting health care companies was both unsettling and disorienting. As I followed the developments in the FTC’s case against LabMD over the past few years (see additional blogs here, here, here and here), I felt like I was moving from the Wild West into Westworld, as the FTC’s arguments (and facts coming to light during the administrative hearings) became more and more surreal.

Finally, though, reality and reason have arrived on the scene as the LabMD saga plays out in the U.S. Court of Appeals for the 11th Circuit. The 11th Circuit issued a temporary stay of the FTC’s Final Order against LabMD (an order that reversed the highly unusual decision against the FTC by the Administrative Law Judge presiding over the administrative action).

The Court summarized the facts as developed in the voluminous record, portraying LabMD as having simply held its ground against the appalling, extortion-like tactics of the company that infiltrated LabMD’s data system. It was that company, Tiversa, that convinced the FTC to pursue LabMD in the first place. According to the Court, Tiversa’s CEO told one of its employees to make sure LabMD was “at the top of the list” of company names turned over to the FTC in the hopes that FTC investigations would pressure the companies into buying Tiversa’s services. As explained by the Court:

In 2008, Tiversa … a data security company, notified LabMD that it had a copy of the [allegedly breached data] file. Tiversa employed forensic analysts to search peer-to-peer networks specifically for files that were likely to contain sensitive personal information in an effort to “monetize” those files through targeted sales of Tiversa’s data security services to companies it was able to infiltrate. Tiversa tried to get LabMD’s business this way. Tiversa repeatedly asked LabMD to buy its breach detection services, and falsely claimed that copies of the 1718 file were being searched for and downloaded on peer-to-peer networks.”

As if the facts behind the FTC’s action weren’t shocking enough, the FTC’s Final Order imposed bizarrely stringent and comprehensive data security measures against LabMD, a now-defunct company, even though its only remaining data resides on an unplugged, disconnected computer stored in a locked room.

The Court, though, stayed the Final Order, finding that even though the FTC’s interpretation of the FTC Act is entitled to deference,

LabMD … made a strong showing that the FTC’s factual findings and legal interpretations may not be reasonable… [unlike the FTC,] we do not read the word “likely” to include something that has a low likelihood. We do not believe an interpretation [like the FTC’s] that does this is reasonable.”

I was still happily reveling in the refreshingly simple logic of the Court’s words when I read the brief filed in the 11th Circuit by LabMD counsel Douglas Meal and Michelle Visser of Ropes & Gray LLP. There, finally, was the legal rationale for, and clear articulation of, the unease I felt nearly three years ago: Congress (through HIPAA) granted HHS the authority to regulate the data security practices of medical companies like LabMD using and disclosing PHI, and the FTC’s assertion of authority over such companies is “repugnant” to Congress’s grant to HHS.

Continuation of discussion of 11th Circuit case and filings by amicus curiae in support of LabMD to be posted as Part 2.

Our partner Elizabeth Litten and I had a recent conversation with our good friend Marla Durben Hirsch, who quoted us in her Medical Practice Compliance Alert article, “Beware False Promises From Software Vendors Regarding HIPAA Compliance.” Full text can be found in the February 2016 issue, but excerpts summarizing her six tips for reducing the risk of obtaining unreliable HIPAA compliance and protection software from vendors appear below.

As the backdrop for her article, Marla used the $250,000 settlement of the Federal Trade Commission (the “FTC”) with Henry Schein Practice Solutions, Inc. (“Henry Schein”) for alleged false advertising that the software it marketed to dental practices provided “industry-standard encryption of sensitive patient information” and “would protect patient data” as required by HIPAA. Elizabeth has already posted a blog entry on aspects of the Henry Schein matter that may be found here.

During the course of our conversation with Marla, Elizabeth observed, “This type of problem [risk of using unreliable HIPAA software vendors] is going to increase as more physi­cians and health care professionals adopt EHR systems, practice management systems, patient portals and other health IT.”

The six tips listed by Marla are summarized as follows:

  1. Litten and Kline: “Vet the software vendor regarding the statements it’s making to secure and protect your data. If the vendor is claiming to provide NIST-standard encryption, ask for proof. See what it’s saying in its marketing brochures. Check references, Google the company for lawsuits or other bad press, and ask whether it suffered a security breach and, if so, how the vendor responded.”
  2. Kline: “Make sure that you have a valid business associate agreement that protects your interests when the software vendor is a business associate.” However, a provider must be cautious to determine first whether the vendor is actually a business associate before entering into a business associate agreement.
  3. Litten: “Check whether your cyberinsurance covers this type of contingency. It’s possible that it doesn’t cover misrepresentations, and you should know where you stand.”
  4. Litten and Kline: “See what protections a software vendor contract may provide you.” For instance, if a problem occurs with the software or it’s not as advertised and the vendor is not obligated to provide you with remedies, you might want to add such protections, using the Henry Schein settlement as leverage.
  5. Litten and Kline: “Don’t market or advertise that you provide a level of HIPAA protection or compliance on your website, Notice of Privacy Practices or elsewhere unless you’re absolutely sure that you do so.” The FTC is greatly increasing its enforcement activity.
  6. Kline: “Look at your legal options if you find yourself defrauded.” For instance, the dentists who purchased the software [from Henry Schein] under allegedly false pretenses have grounds for legal action.

The primary responsibility for compliance with healthcare data privacy and security standards rests with the covered entity. It must show reasonable due diligence in selecting, contracting with, and monitoring performance of, software vendors to avoid liability for the foibles of its vendors.

Our partner Elizabeth Litten and I were quoted by our good friend Marla Durben Hirsch in her article in Medical Practice Compliance Alert entitled “6 Compliance Trends Likely to Affect Your Practices in 2016.” Full text can be found in the January 13, 2016, issue, but a synopsis is below.

For her article, Marla asked various health law professionals to make predictions on matters such as HIPAA enforcement, the involvement of federal agencies in privacy and data security, and actions related to the Office for Civil Rights (“OCR”) of the federal Department of Health and Human Services (“HHS”).

After the interview with Marla was published, I noted that each of Elizabeth’s and my predictions described below happened to touch on our anticipation of the expansion by HHS and other federal agencies of their scope and areas of healthcare privacy regulation and enforcement. I believe that this trend is not a coincidence in this Presidential election year, as such agencies endeavor to showcase their regulatory activities and enlarge their enforcement footprints in advance of possible changes in the regulatory environment under a new administration in 2017. If an agency can demonstrate effectiveness and success during 2016 in new areas, it can make a stronger case for funding human and other resources to continue its activities in 2017 and thereafter.

Our predictions that were quoted by Marla follow.

Kline Prediction: Privacy and data enforcement actions will receive more attention from federal agencies outside of the OCR.

In light of the number of breaches that took place in 2015, the New Year will most likely see an increase in HIPAA enforcement. However, regulators outside of healthcare — such as the Department of Homeland Security, the Securities and Exchange Commission and the Federal Communications Commission — will also try to extend their foothold into the healthcare compliance realm, much in the way that the Federal Trade Commission has.

Litten Prediction: The Department of Justice (DOJ) and the OCR will focus more on individual liability

In September of 2015, the DOJ announced, through the Yates Memo, that it would be shifting its strategy to hold individuals to a higher level of accountability for an entity’s wrongdoing. The OCR has also indicated that it will focus more on individuals who violate HIPAA. “They’re trying to put the fear in smaller entities. A small breach is as important as a big one,” says Litten.

Kline Prediction: OCR will examine business associate relationships.

The HIPAA permanent audit program, which has been delayed by the OCR, will be rolled out in 2016 and will scrutinize several business associates. In turn, all business associate relationships will receive increased attention.   According to Kline, “There will be more focus on how you selected and use a business associate and what due diligence you used. People also will be more careful about reviewing the content of business associate agreements and determining whether one between the parties is needed.”

We shall continue to observe whether the apparent trend of federal agencies to grow their reach into regulation of healthcare privacy continues as we approach the Presidential election.

Already many blogs and articles have been written on Chief Administrative Law Judge D. Michael Chappell’s November 13, 2015 92-page decision exonerating LabMD from the FTC’s charges that it failed to provide reasonable and appropriate security for personal information maintained on its computer networks in violation of Section 5(a) of the FTC Act.  A number of the commentators accurately point out that this ruling makes it clear the FTC does not have unbridled enforcement authority over allegedly “unfair” data security cases.

The FTC would have had Chief Judge Chappell believe that liability should be imposed for conduct that is theoretically “likely” to cause consumer harm, despite its inability to identify a single instance of consumer harm over the course of 7 years since the allegedly “unfair” conduct occurred. Judge Chappell refused to drink the FTC’s Kool-Aid, though, restoring my faith in the ability of logic and rational thinking to outweigh agency fluff and bluster in an administrative judicial proceeding.  Section 5(n) of the FTC Act requires a showing that the conduct “caused, or is likely to cause, substantial injury to consumers,” and while the Act doesn’t define the word “likely”,  Judge Chappell concluded that:

The term “likely” in Section 5(n) does not mean that something is merely possible.  Instead, “likely” means that it is probable that something will occur.”

Hardly complex legal reasoning – just basic, simple common sense.

We blogged on this case and the FTC’s enforcement activities in the data security realm in October of 2014 (read here), as well as in March, April, May and June of 2014 (read here), and have closely followed LabMD founder Michael Daugherty’s tireless battle to defend his small, now-defunct cancer testing company from what has seemed an outrageous abuse of regulatory enforcement power from the beginning.

It’s refreshing (and relieving, for other businesses facing FTC investigations over what may seem to be minor and inconsequential infractions) that Judge Chappell carefully considered the evidence presented over the course of approximately two years and injected intelligence and reason into a case that seemed shockingly deficient in these traits.  Thank goodness Judge Chappell refused to drink from the FTC’s “possible-means-likely” cup of legal reasoning.  However, the Judge’s painstakingly articulated factual findings, enumerated in 258 paragraphs, reveal the unsettling back-story behind this case.

The FTC’s case was built around information provided to it by a company affiliated with Tiversa, a business involved in finding security vulnerabilities in companies’ computer networks and then selling remediation services to the companies to prevent similar infiltrations.  LabMD declined Tiversa’s offer to sell it remediation services.  Chief Judge Chappell found:

158.  Mr. Boback’s motive to retaliate against LabMD for refusing to purchase remediation services from Tiversa … resulted in Tiversa’s decision to include LabMD in the information provided to the FTC… .”

The FTC may be wishing it had heeded the warning and advice of FTC Commissioner J. Thomas Rosch, who had initially suggested (in his Dissenting Statement issued June 21, 2012) that FTC staff should not rely on Tiversa for evidence or information related to LabMD, given Tiversa’s business model and prior attempts to sell its services to LabMD, in order to avoid the appearance of impropriety.  Instead, FTC staff readily accepted Tiversa’s Kool-Aid, relying on evidence it might have realized was tainted at the outset.

Again, hardly complex reasoning – just basic, simple common sense:  if it doesn’t smell or taste right, don’t drink the Kool-Aid.

Regardless of whether the case is appealed and its ultimate outcome, the LabMD ruling  may serve as a precedent to encourage others to challenge the FTC’s enforcement authority under Section 5, authority that the agency has expanded over the years through consent decrees, particularly where there is no evidence that allegedly inadequate security practices have resulted in (or will probably result in) consumer harm.

A recent post on this blog by our partner Elizabeth Litten was quoted in the Dissenting Statement (the “Dissent”) of FTC Commissioner Maureen K. Ohlhausen in the Matter of Nomi Technologies, Inc., Matter No. 1323251. Ms. Ohlhausen disagreed with the views of the majority of the Commissioners in the Matter because she believed that

. . . by applying a de facto strict liability deception standard absent any evidence of consumer harm, the proposed complaint and order inappropriately punishes a company that acted consistently with the FTC’s privacy goals by offering more transparency and choice than legally required.

To buttress her viewpoint, Ms. Ohlhausen quoted as follows from Elizabeth’s post, which was referenced at footnote 9:

In response to the case’s release, one legal analyst [Elizabeth Litten] advised readers that ‘giving individuals more information is not better’ and that where notice is not legally required, companies should ‘be sure the benefits of notice outweigh potential risks.’

The takeaway from the FTC decision in Nomi and the Dissent appears to be that, in setting and publishing privacy policies, an organization should carefully consider whether adopting standards in excess of legal requirements is advisable if there is a reasonable possibility that the organization may find such standards difficult or costly to attain and maintain, thereby increasing the risk of regulatory scrutiny and sanctions.

We know by now that protected health information (PHI) and other personal information is vulnerable to hackers.  Last week, the Washington Times reported that the Department of Health and Human Services (HHS), the agency responsible for HIPAA enforcement, had suffered security breaches at the hands of hackers in at least five separate divisions over the past three years.  The article focused on a House Committee on Energy and Commerce report that described the breaches as having been relatively unsophisticated and the responsible security officials as having been unable to provide clear information regarding the security incidents.

We know it’s not a question of “if” sensitive information maintained electronically will be compromised by a hacking or other type of cyber security incident, but “when” — regardless of who maintains it — and how destructive an incident it will be. Even HHS and its operating divisions, which include both the Office of Civil Rights (OCR), charged with protecting PHI privacy and security, and the Food and Drug Administration (FDA), the country’s principal consumer protection and health agency, are vulnerable.

Just one day before its coverage of the House Committee report on the cyber security vulnerabilities that exist within the very government agencies charged with protecting us, the Washington Times reported on an even more alarming cyber security risk: the vulnerability of common medical devices, such as x-ray machines and infusion pumps, to hacks that could compromise not just the privacy and security of our health information, but our actual physical health.

This report brought to mind a recent report on the ability of hackers to remotely access the control systems of automobiles.  While the thought of losing control of my car while driving is terrifying, the realization that medical devices are vulnerable to hackers while being used to diagnose or treat patients is particularly creepy.  The two situations may present equally dangerous scenarios, but hacking into a medical device is like hacking into one’s physical being.

So while it’s one thing to have PHI or other sensitive information compromised by a hacking incident, it’s much more alarming to think that one’s health status, itself, could be compromised by a hacker.

This case has nothing to do with HIPAA, but should be a warning to zealous covered entities and other types of business entities trying to give patients or consumers more information about data privacy than is required under applicable law.  In short, giving individuals more information is not better, especially where the information might be construed as partially inaccurate or misleading.

The Federal Trade Commission (FTC) filed a complaint against Nomi Technologies, Inc., a retail tracking company that placed sensors in clients’ New York City-area retail stores to automatically collect certain data from consumers’ mobile devices as they passed by or entered the stores.  Nomi’s business model was publicized in a July 2013 New York Times article.  The complaint alleged, among other things, that although Nomi’s published privacy policy stated that Nomi would “allow consumers to opt out of Nomi’s [data tracking] service on its website as well as at any retailer using Nomi’s technology,” Nomi actually only allowed consumers to opt-out on its website — no opt-out mechanism was available at the clients’ retail stores.

The FTC voted 3-2 to accept a consent order  (published for public comment on May 1, 2015) from Nomi under which Nomi shall not:

[M]isrepresent in any manner, expressly or by implication:  (A) the options through  which, or the extent to which, consumers can exercise control over the collection, use, disclosure, or sharing of information collected from or about them or their computers or devices, or (B) the extent to which consumers will be provided notice about how data from or about a particular consumer, computer, or device is collected, used, disclosed, or shared.”

The odd aspect of this complaint and consent order is that Nomi did not track or maintain information that would allow individual consumers to be identified.  The media access control (MAC) address broadcast by consumers’ mobile devices as they passed by or entered the stores was cryptographically “hashed” before it was collected, creating a unique identifier that allowed Nomi to track the device without tracking the consumer him/herself.  As dissenting Commissioner Maureen Ohlhausen points out, as “a third party contractor collecting no personally identifiable information, Nomi had no obligation to offer consumers an opt out.”  The majority, however, focuses on the fact that the privacy policy’s opt-out statement was partially inaccurate, then leaps to the conclusion that the inaccuracy was deceptive under Section 5 of the FTC Act, without pausing to reflect on the fact that the privacy policy and opt-out process may not have been required by law in the first place.
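For readers curious about the mechanics, here is a minimal sketch (in Python, and purely illustrative — not Nomi’s actual code, whose details are not public) of the hashing approach the complaint describes: a device’s MAC address is run through a one-way hash, yielding a stable identifier that lets repeat visits by the same device be linked without ever storing the raw address. Note that because the MAC address space is small, an unsalted hash of it can be reversed by brute force, so hashing alone is a fairly weak form of pseudonymization.

```python
import hashlib

def pseudonymous_id(mac: str, salt: bytes = b"") -> str:
    """Derive a stable identifier from a MAC address without retaining it.

    Caveat: the MAC address space (~2**48 values) is small enough that an
    unsalted hash can be reversed by brute force, so this is weak
    pseudonymization unless the salt is kept secret.
    """
    # Normalize formatting so "AA-BB-.." and "aa:bb:.." map to one identifier.
    normalized = mac.lower().replace("-", ":").encode("utf-8")
    return hashlib.sha256(salt + normalized).hexdigest()

# The same device always produces the same identifier, so visits can be
# counted and linked, but the raw MAC address itself is never stored.
visit_1 = pseudonymous_id("AA:BB:CC:DD:EE:FF")
visit_2 = pseudonymous_id("aa-bb-cc-dd-ee-ff")
assert visit_1 == visit_2
```

The design trade-off this illustrates is exactly the one the Dissent emphasizes: the collector holds a consistent per-device token rather than personally identifiable information, yet the token still supports the cross-visit tracking that the privacy policy promised consumers could opt out of.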

So while many HIPAA covered entities and other businesses may want to give consumers as much information as possible about data collection, the lesson here is twofold:  first, make sure the notice is required under applicable law (and, if it’s not, be sure the benefits of notice outweigh potential risks); and, second, make sure the notice is 100% accurate to avoid FTC deceptive practices claims.