NOTE: This blog includes discussions and explorations of legal issues. The content is intended for educational purposes only, and is not intended to be legal advice. You should always consult with a licensed attorney before taking any action that could affect or impair your legal rights.
A previous post covered some of the basics of HIPAA. This week, I think it is important to address what appears to be some common confusion about implementing a system that complies with HIPAA. The confusion is rooted in conflating what “the law” requires with what standards require; standards operate more as a basis for comparing one implementation against a relevant consensus.
A hypothetical situation in general terms (not necessarily a reflection of what is or is not the law) might be helpful. Say there is a new law providing that an individual’s favorite type and flavor of candy is protected information (and assume that people do not always go around sharing that information with anyone who will listen). Assume further that the law says any pastry chef in possession of this information must safeguard those preferences from being disclosed without the consent of the individual. Effectively, all the law has told us is that a certain kind of information has to be treated carefully so that it is not disclosed without the individual’s consent.
After the legislation passes, the Secretary of the United States Department of Sweet Teeth (“DST”) uses the enabling clause of the recently passed legislation to begin coming up with rules and regulations that expand on the law’s principal goal: mandating that an individual’s candy preferences be treated with care and handled securely.
The DST Secretary delegates the responsibility to draft and compile rules and regulations instructing that protected candy information (“PCI”) be kept safe to the head of an agency inside the DST: the Center for Secured Bliss. This responsibility continues to be delegated until someone actually drafts rules that explain things like the following: once an individual shares his or her candy preferences with a pastry chef, the individual inherently consents to the pastry chef disclosing the PCI to the pastry chef’s apprentice for the purposes of making a pastry for that individual, but the individual does not consent to the pastry chef disclosing the PCI to the pastry chef’s tax lawyer (we’re assuming there is no disparate tax treatment between the candies).
We can skip over the administrative law components of what happens next, but we will probably end up with rules in the Code of Federal Regulations (“CFR”) with some general requirements that might look like this:
- General requirements. [Pastry chefs] and [pastry chefs’ apprentices] must do the following:
(1) Ensure the confidentiality, integrity, and availability of all electronic protected [candy] information the [pastry chef] or [the pastry chef’s apprentice] creates, receives, maintains, or transmits.
(2) Protect against any reasonably anticipated threats or hazards to the security or integrity of such information.
(3) Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required under subpart E of this part.
(4) Ensure compliance with this subpart by their workforce. 45 C.F.R. § 164.306(a) (2020).
And not far from those general requirements, we are likely to end up with a provision that recognizes that not every pastry chef is the same.
(b) Flexibility of approach.
(1) [Pastry chefs] and [pastry chefs’ apprentices] may use any security measures that allow the [pastry chef] or [the pastry chef’s apprentice] to reasonably and appropriately implement the standards and implementation specifications as specified in this subpart.
(2) In deciding which security measures to use, a [pastry chef] or [a pastry chef’s apprentice] must take into account the following factors:
(i) The size, complexity, and capabilities of the [pastry chef] or [the pastry chef’s apprentice].
(ii) The [pastry chef]’s or the [pastry chef’s apprentice]’s technical infrastructure, hardware, and software security capabilities.
(iii) The costs of security measures.
(iv) The probability and criticality of potential risks to electronic protected [candy] information. 45 C.F.R. § 164.306(b) (2020).
Eventually we might come across a rule that provides guidance on “[t]echnical safeguards,” such as using a system that has “access control,” meaning only “persons or software programs that have been granted access” to electronic protected candy information are able to gain access (45 C.F.R. § 164.312(a)(1) (2020); emphasis added to distinguish between “granted” and “gain”). Compliance with those technical safeguards might be supported by “[i]mplementation specifications” that require individuals be “[a]ssign[ed] a unique name and/or number for identifying and tracking user identity” (45 C.F.R. § 164.312(a)(2)(i) (2020)), establishing “procedures for obtaining necessary electronic protected [candy] information during an emergency” (45 C.F.R. § 164.312(a)(2)(ii) (2020)), having procedures that automatically log an authorized user out of the electronic system (45 C.F.R. § 164.312(a)(2)(iii) (2020)), and using a system with “a mechanism to encrypt and decrypt electronic protected [candy] information” (45 C.F.R. § 164.312(a)(2)(iv) (2020)).
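To make one of those implementation specifications concrete, here is a minimal sketch, in Python, of how a system might satisfy an "automatic logoff" requirement by expiring idle sessions. The class, the method names, and the 15-minute timeout are all hypothetical choices for illustration; the regulation names the goal, not the mechanism.

```python
import time

# Hypothetical illustration of an "automatic logoff" implementation
# specification: a session is treated as expired once it has been idle
# longer than a chosen timeout. The 15-minute value is an assumption,
# not a number taken from any regulation.
IDLE_TIMEOUT_SECONDS = 15 * 60

class Session:
    def __init__(self, user_id, now=None):
        self.user_id = user_id          # the user's unique identifier
        self.last_activity = now if now is not None else time.time()

    def touch(self, now=None):
        """Record user activity, resetting the idle clock."""
        self.last_activity = now if now is not None else time.time()

    def is_expired(self, now=None):
        """True once the session has been idle past the timeout."""
        now = now if now is not None else time.time()
        return (now - self.last_activity) > IDLE_TIMEOUT_SECONDS

# A request handler would check is_expired() on each request and force
# re-authentication once it returns True.
```

Note that nothing here dictates the timeout value; choosing it is exactly the kind of judgment the "flexibility of approach" provision leaves to the regulated party.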
We can safely assume that were a pastry chef to plaster an individual’s candy preferences on a billboard, we would not consider that an example of treating the protected candy information (“PCI”) securely. It is also a pretty safe assumption that the pastry chef probably doesn’t need to store a sticky note with the PCI in a vault buried at Fort Knox, guarded 24/7 by a battalion of soldiers. In all likelihood, a consensus will begin to amass somewhere between the extremes of plastering PCI on a billboard and utilizing military-grade resources to protect it. Eventually that consensus is likely to accumulate to the point where a standard emerges from a coalition, which could be called the Candy Preference Security Standards Association.
Applicability to HIPAA
Just about everything covered so far has been with regard to what the law has to say about properly handling protected information (be it health or “candy”). And while there are plenty of specifics, gaps still exist between the rules and statutes relating to HIPAA and their application in a real-world setting.
Let’s start with an easy requirement: each individual accessing protected information is required to have a unique username. How unique does it have to be? How much identifying information needs to (or can) be easily available to an auditor of the system? Should the username include a portion of the user’s actual name so that the real identity behind the username is apparent, or should it be a random assortment of letters and numbers in order to anonymize the user accounts and help defend against hacks that use social engineering?
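The two username strategies can be sketched side by side. Both functions below are hypothetical illustrations; nothing in the rules prescribes either format.

```python
import secrets

def name_based_username(first, last):
    """A username whose owner is apparent to an auditor: first initial
    plus last name, e.g. 'jdoe'. Easy to audit, easier to guess."""
    return (first[0] + last).lower()

def anonymized_username(num_bytes=4):
    """A random assortment of hex characters that reveals nothing about
    the user's real identity; the mapping back to a person would live
    in a separately protected table."""
    return "u-" + secrets.token_hex(num_bytes)
```

The trade-off is the one posed above: the first form makes audits straightforward, while the second resists social engineering at the cost of an extra lookup.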
What about access control? What sort of threats does the system need to defend against? Could the password simply be a few characters long, or does it need to be seventy-five characters long? Should two-factor authentication be made available to users? Should it be required? Should users be required to regularly change their password?
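A minimal sketch of how those access-control questions turn into code, assuming a 12-character password minimum and an optional two-factor requirement. Both thresholds, and the function names, are illustrative assumptions, not values drawn from any rule.

```python
# Assumed policy values for illustration only.
MIN_PASSWORD_LENGTH = 12

def password_acceptable(password):
    """A few characters is clearly too short; seventy-five is overkill
    for most settings. This sketch simply enforces a minimum length."""
    return len(password) >= MIN_PASSWORD_LENGTH

def login_allowed(password_ok, two_factor_completed, require_2fa=True):
    """Access control: a correct password alone is not enough if the
    policy makes two-factor authentication mandatory."""
    if not password_ok:
        return False
    return two_factor_completed or not require_2fa
```

Whether `require_2fa` defaults to true is precisely the kind of decision the law leaves open and industry consensus tends to settle.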
What about encryption? Could we use a Jefferson disk to encrypt our communications, or do we need something more complex? Does the information only need to be encrypted while “in transit,” or should it also be encrypted while “at rest”?
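A toy shift cipher, standing in for the Jefferson disk, makes the point: it round-trips correctly but is trivially breakable, which is why modern consensus standards point to vetted algorithms instead (for example, AES for data at rest and TLS for data in transit). Everything below is an illustration, not an implementation anyone should use.

```python
# A toy, per-letter shift cipher standing in for the Jefferson disk.
# It "works" in the sense that decryption inverts encryption, but a
# hobbyist could break it by hand; no current standard would treat it
# as adequate encryption for protected information.
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def toy_encrypt(plaintext, key):
    """Shift each lowercase letter by the matching key value
    (the key repeats when shorter than the message)."""
    out = []
    for i, ch in enumerate(plaintext):
        shift = key[i % len(key)]
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
    return "".join(out)

def toy_decrypt(ciphertext, key):
    """Undo the encryption by applying the negated shifts."""
    return toy_encrypt(ciphertext, [-k for k in key])
```

The gap between this toy and real-world cryptography is the gap the statute never fills; best practices and standards do.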
The point is that “the law” is not necessarily going to tell you what software “meets requirements.” It won’t even necessarily tell you what kind of encryption you might need. That usually comes about through best practices and industry standards.
Compliance with these best practices and industry standards is often (though not always) the flashy, major selling point of advertised products and services. That does not mean the “flashy and major selling points” are somehow frivolous, shallow, inferior, or gimmicks, but it is important to understand what a product or service actually does, to whom or in what scenarios it applies, and in some cases, how it works to some degree. Obviously, you don’t want to pay for something that isn’t necessary or beneficial to your business.
In the previous post on the basics of HIPAA, we saw that HIPAA only applies to specific people (or groups of people): covered entities, business associates, and hybrid entities. In this post, we have come to understand that in order for those groups to comply with HIPAA, they have to implement certain standards. What those standards look like can and does change, so the systems and technologies used by covered entities, business associates, and hybrid entities yesterday might not comply with the standards today, just as technology and systems that comply today might not be considered compliant tomorrow.
Providers of Health-Adjacent Services
The discussion of technical standards so far has emphasized how compliance is achieved, but as mentioned earlier, that inquiry is separate from who must comply. This criterion is more firmly fixed, and it does not change simply because of shifts in the industry.
Obviously, providers of health care are subject to HIPAA, but what about people that provide health-adjacent services, like patient advocates? Assuming that patient advocates do not begin to provide services outside of their presently customary stable of services (which does not include providing health care), a patient advocate does not fit the definition of a covered entity, a business associate, or a hybrid entity. That means that, generally speaking, a patient advocate would not be subject to HIPAA. In order for that to change, it would require an amendment either to the statutes or the regulations, newly interpreting typical patient advocacy activities as equivalent to providing health care services. Admittedly, it is conceivable that a situation could arise where a court found a “patient advocate” provided health care services, making the advocate subject to HIPAA, simply by providing “typical” patient advocacy services, but such an interpretation would stretch the meaning of the “plain language” of the statute and move precedent away from a bright-line rule, both of which are the kinds of decisions that a judge is reluctant to embrace.