Risk Assessments: Expert Advice on How NIST Guidance Can Help Healthcare Organizations

Too many healthcare providers fail to conduct comprehensive, timely risk assessments, as required under HIPAA as well as the HITECH Act, says security consultant Kate Borten, president of The Marblehead Group. But updated risk assessment guidance from the National Institute of Standards and Technology provides useful insights that providers can put to use, she says.

The HIPAA Security Rule, as well as the HITECH Act's electronic health record incentive program, require risk assessments. Nevertheless, many healthcare organizations find conducting a risk assessment "intimidating," she acknowledges in an interview with HealthcareInfoSecurity (transcript below). That's why the NIST guidance will prove helpful to many, she says.

By using the NIST SP 800-30 guidance, healthcare providers can get a much better understanding of fundamentals such as risks, threats and vulnerabilities, Borten suggests. She advises healthcare organizations to use the guidance to help craft a customized assessment and pinpoint a plan for mitigating risks.

In the interview, Borten:

  • Provides a detailed analysis of the guidance;
  • Discusses how frequently risk assessments should be conducted;
  • Describes circumstances when encryption of health data might not be needed, and how to document such decisions.

Before founding The Marblehead Group in 1999, Borten led the enterprisewide security program at Massachusetts General Hospital in Boston and established the first information security program at Beth Israel Deaconess Medical Center and its parent organization, CareGroup, as its chief information security officer.

The Risk Assessment Challenge

MARIANNE KOLBASUK MCGEE: When it comes to risk assessment, many healthcare organizations are lax. Why?

KATE BORTEN: Well I think this is one of the most problematic or challenging requirements in the [HIPAA] security rule. We have this guidance ... from NIST, but it's a little bit hard to get your arms around. I think it is intimidating to many organizations, especially on the provider side. ... They really haven't done anything like this before, where it has been a very structured or formal process for routinely looking for information security risks and dealing with them. I think, intuitively, many IT people do this as part of their work, but very few IT people over the years have had this as part of their training to understand the terminology and so on. ... It has been very challenging.

HITECH vs. HIPAA

MCGEE: What are the key differences between doing risk assessments to comply with the HITECH electronic health record incentive program Stage 2 rule versus the HIPAA Security Rule?

BORTEN: First of all, it's a requirement for [HITECH] Stage 1 as well. ... You have to attest in order to receive the incentive payments from Medicare or Medicaid - you have to actually say, "Yes, I have performed a security risk assessment and I have addressed any of the security issues of significance." It isn't just performing the risk assessment that is a requirement. It is finding the risks and then fixing them or mitigating them that you are required to basically legally attest to. ... So it is a big deal.

On the other hand, when the [HITECH] meaningful use regulations were written, the regulators fully understood that this is a requirement already [under HIPAA]. ... They are saying to the organizations, "We know you've been doing security risk assessments all along because this has been a federal regulation in effect since 2005. So come on folks, you're doing this, right?" So it is a little bit of a reminder or wake-up call that organizations are supposed to be doing this.

But the fact is that many, if not most, really aren't or they did it once way back in 2005. You know how new regulations come out and you ramp up and then you kind of shelve them and forget about them and they are dormant. So I think this is a jolt to a lot of organizations, but the actual risk assessment [requirement under HITECH] is really no different [than HIPAA]. The only difference is that when you're doing a risk assessment for meaningful use, the focus, the scope, the primary target is going to be that certified electronic health record, whether it's complete or a module. You are going to be looking at that as the primary focus, but not exclusive. You need to be looking at the surrounding controls as well. So it's not limited just to the questions: What are the system controls within the EHR? And have we implemented them in our organization appropriately?

So ... why do organizations struggle [with risk assessments]? One of the challenges is there is a lot of flexibility, but you have to define the scope. Where am I assessing risk? Is it in our entire environment or is it focused on a particular system? Just as internal auditors often use a process of rolling audits ... in a practical way, a lot of organizations do rolling risk assessments. So what is the target at this point? Well, when you're talking about meaningful use incentive payments and the risk assessment, the main focus should be on that certified EHR, but again not exclusively.

Encryption Issues

MCGEE: Because encryption is an addressable requirement in HIPAA, organizations must document what alternative reasonable measures they are taking to protect data if they choose not to encrypt. Can you provide some examples of circumstances when an organization might choose not to encrypt patient information? And what is the best way to document the reasons for such a decision?

BORTEN: Well let's start by talking about what addressable means. ... Addressable does not mean optional, and a lot of organizations continue to believe that is what it means. But if you read the HIPAA rule preamble and you listen to the folks who wrote these rules, that is absolutely not what it means. It just gives you a bit more flexibility. ... The fact is the security rule talks about requiring encryption in two separate places. In one place, it talks about encryption of protected health information as it is being transmitted. And in another place it talks about encryption of PHI at rest - sitting in a file or a database or something like that. So that is an appropriate division. The technologies we use are different; the security risks are different under different circumstances.

So when we look at, for example, protecting PHI when it is being transmitted over a network ... what I always tell my clients is, at a minimum, your policy should say that any PHI or other confidential information must be encrypted over the Internet and over wireless networks. We know that there is heightened risk [involved]. Those are public airwaves and the public Internet, so we have to encrypt because ... my organization doesn't own the Internet or the airwaves.

So we have to add encryption as a method of access control when normal access control methods aren't within our ability to control. But for transmitting PHI over your local network, I know of very few organizations that do that [use encryption]. It's not necessary; it's not practical. Not that there is no risk, but we have a lot of other tools that we can use to reduce the risk of inappropriate access or disclosure. For example, we use administrative and physical and technical controls on our own network. ... You have to have a unique user ID; you have to be granted access; you have to have an active directory account and so on. And we have sanction policies that say, if you break the rules you'll pay the price. We have physical controls - locked data closets or network closets where the network equipment ... critical to managing the network traffic is locked up and very few people have access. We have technical controls. We use those switches, for example, to create virtual networks that make unauthorized snooping on the network harder.

So the controls that we put in place as part of the information security program will help protect that information on our local network where we have control. So that is what you point to in an addressable specification for encrypting PHI in transmission - you say, "Yes, in those areas where we can't control use over public air - wireless networks and over the Internet - we require encryption. On our local network, we've got all these other things." ...
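As a sketch of what the transmission side of such a policy can look like in practice, the Python snippet below (standard library only) builds a TLS context a client might require for any connection that crosses the public Internet or a wireless network. The policy wording in the comments is an illustrative assumption, not text from the HIPAA rule or the interview.

```python
import ssl

# Illustrative policy: PHI crossing the public Internet or wireless networks
# must be encrypted in transit; the local wired LAN instead relies on the
# administrative, physical and technical controls described above.

def internet_tls_context() -> ssl.SSLContext:
    """Build a TLS client context for transmitting PHI over public networks."""
    ctx = ssl.create_default_context()            # verifies server certificates
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    return ctx

ctx = internet_tls_context()
# create_default_context() turns on certificate and hostname verification:
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname
```

The point of the sketch is that "encrypt over the Internet and wireless" can be enforced as a technical control (a mandatory TLS configuration) rather than left as a paper policy alone.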

NIST Guidance

MCGEE: NIST recently issued updated guidance about risk assessments. What is the most important thing healthcare organizations should know about the new guidance?

BORTEN: This is one of my favorite websites. I strongly encourage anybody listening, if you don't know about this already, check out the computer security resource center, CSRC.NIST.gov. NIST is funded by our tax dollars. They fall under the Department of Commerce, and they write basically special publications ... on various security topics. ... Federal agencies are typically required to follow what these papers say. The rest of us in the private sector may or may not, but increasingly we are voluntarily adopting these.

For example, in the HITECH Act and the regulations that came out on breach notification, there is a lot of intricate detail about what actually constitutes a breach that requires notifying patients and the government, and there are some safe harbors. If, in fact, you have destroyed PHI, whether it's paper documents or electronic, following NIST recommendations ... you are in a safe harbor there. ... The same with encryption - if you are following the NIST encryption recommendations then you are safe [you don't need to report a breach of encrypted data]. Now, we're not talking about NSA/DoD-level security here; we're talking about reasonable business-level security. If you are following the NIST practices for disposal and for encryption, you're doing a good job.

So it's important that organizations be very much aware of this resource and use it to develop their own policies and choose technical solutions. Even if you really don't know anything about encryption, you better look for those magical terms - the names of the encryption algorithms being used. ...

In September of this year, NIST released a revised document on performing risk assessment. This is NIST special publication 800-30. ... There has been an 800-30 on risk assessment for a number of years. They rewrote it. My sense is that the core content hasn't changed; it is based on basic terminology and principles that have been around for a long time. But I think what has changed is the tone - it is more business-oriented instead of government-oriented, and the presentation is just more accessible.

So I strongly recommend that organizations go look this document up and read it. It is not that long; the content is only about forty pages. There are only three chapters.

The fundamentals [are important for] organizations of any type and of any size; this is what IT people just don't get in IT training unless they have some specialized security training. ... Organizations struggle because they are not quite sure they know what they are doing or understand these terms.

Risk is the result of a threat acting on a vulnerability. That is the basic formula. ... Remember, information security programs are composed of administrative controls, such as policies and procedures and training; physical controls, making sure the network equipment is locked up, server rooms and so on; and technical controls, such as passwords and firewalls and ... encryption. We [need to] look for weaknesses, [like the lack of] any policy requiring encryption under those circumstances I mentioned earlier, or [failure to] encrypt email. [If that's the case,] maybe we've got to look at a technical solution there, and when we do that, maybe we need to provide training so that users know why it is important and how to use it. [The key is] finding the vulnerabilities that could be exploited. If you don't have a policy, somebody might unwittingly be transmitting PHI over the Internet that is not encrypted.

Now, let's go back to the basics, the fundamentals. Threats act on vulnerabilities. ... What is a threat? There are insider threats - disgruntled employees - and outsider threats, the hackers on the Internet. NIST says there are three sources of threats. People are, by far, the most difficult to deal with. But we also shouldn't forget natural and environmental threats. Natural threats include hurricanes. ... Environmental threats are more localized problems - power outages, burst water pipes. So threats act on vulnerabilities to create risks.

The risk assessment process is identifying the threats - general threats and threats that might be particular to your organization. And then the most important part of it is finding where you have vulnerabilities. Where do you have weaknesses and holes that could be taken advantage of? And when you identify a risk ... you weigh it on two different scales. First, you look at what is the likelihood of this event actually coming about. And secondly, if it were to happen, how bad would it be? What would be the criticality of it, the seriousness of the impact?

You can imagine in healthcare, there are not hard numbers. This is all very soft stuff. It's all relative, and it's all based on knowledge of healthcare organizations and how they operate, as well as what risks and vulnerabilities are out there. We know by looking at the Department of Health and Human Services' wall of shame [listing major breaches] that one of the most common scenarios is a lost or stolen laptop or other portable device or portable media that has PHI on it that hasn't been appropriately encrypted. So we know that is a high risk area. ...

And by the way, I always say, don't just limit your policies and procedures to PHI; expand that to cover all of your organization's confidential information. It is very likely that you are in a state that has a law that requires certain protections for individuals ... [regarding protecting] Social Security numbers, for example. Make sure that you don't silo your security program protections.

So the risk assessment identifies risks and weights them ... high, medium and low.
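The two-scale weighting Borten describes - likelihood on one axis, impact on the other, rolled up into high, medium or low - can be sketched as a simple rating matrix. This is an illustrative sketch, not NIST's prescribed method; the scale values, thresholds and example finding are assumptions.

```python
# Illustrative two-scale risk rating: likelihood x impact -> high/medium/low.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def rate_risk(likelihood: str, impact: str) -> str:
    """Combine a likelihood rating and an impact rating into an overall rating."""
    score = LEVELS[likelihood] * LEVELS[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example finding from the interview: an unencrypted laptop holding PHI is one
# of the most common breach scenarios, so both likelihood and impact are high.
print(rate_risk("high", "high"))   # -> high
print(rate_risk("low", "medium"))  # -> low
```

The numeric scores are only a convenience for ordering findings; as the interview notes, the underlying judgments remain soft and relative, and each rated risk still needs a business decision behind it.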

Then the next step beyond risk assessment ... is you then have to follow it up: Here is the report of all of the problems, what are we going to do about them? ... [For example,] you know there is a system that has really poor technical security controls, but we're on a path to replace it in six months. So we're not going to go back and change the technology. Maybe we will beef up some user awareness ... so that the users can take steps or be aware and try harder to protect access to the system. But if the system is going away, we're not going to invest a lot of time and money into changing it if we're bringing in something new.

So each one of these risks needs to be accompanied by a business decision: Are we going to live with it? ... There are always risks in the environment. The challenge here is deciding which of these risks are too big to live with ... and what are we going to do? Do we use administrative controls, physical controls, technical controls or combinations of those? What is the solution? What are the mitigating steps that we can take? And then you put action plans in place and carry those out. Simply doing a risk assessment without risk mitigation just doesn't make sense. They go hand-in-hand.

[The NIST guidance] chapter 3 talks about the process and how you get ready for and conduct a risk assessment. It discusses things like making sure you've identified and documented the scope of this particular risk assessment. What are we looking for? What tools are we using?

Now there is a lot of good information here, but it doesn't provide a take-away checklist. Usually, unless you're a very small office, a checklist approach by itself is not sufficient. Certainly checklists can help ... but simply following a checklist to perform a risk assessment isn't sufficient in anything larger than a small office setting.

So unfortunately there is no great master plan that you can simply take away from this and turn right around and start using. You do have to think about it, customize it to your own organization, and essentially develop your own process based on these terms and principles that underlie security risk assessments.

Frequency of Assessments

MCGEE: You mentioned earlier that healthcare organizations do risk assessments and then put them on a shelf and forget about them. How often should healthcare providers do risk assessments?

BORTEN: Well this is also a fuzzy area. The HIPAA Security Rule says to do this periodically and as needed. So periodically might be annual or it might be every three years. It depends on how the organization handles its risk assessments. If you are doing every single system in the place, it might be on a rolling basis and you might not get back to assessing a particular system for a while. I think there is a lot of latitude in there. ... As long as an organization takes it seriously and has a reasonably defensible plan and is following it, it's going to be okay.

The other component is you're also supposed to do a spot risk assessment when something is changing. So you implement a new system. For many organizations that are seeking the [HITECH] meaningful use incentive payments, they have implemented a new EHR or maybe upgraded one that they already had. ... This is an appropriate time [to update a risk assessment].

So HIPAA Security Rule compliance requires us to be doing this on some kind of periodic, ongoing basis as well as when something changes - [such as] you acquire a new business or you start doing things over the Internet you haven't done before. ...

So the message here is, covered entities and their business associates ... should routinely be doing risk assessment. Going back to this NIST document, I would recommend anyone in a position of responsibility for making sure this happens reads this document and understands more about risk assessment, and makes sure that it is going on in your organization on a routine basis. This is not, and was never intended to be, "we do it once and then forget about it." The environment changes constantly, especially in healthcare, especially in technology. So the lesson is: Keep it up. Just keep doing it.



