Healthcare IT News Q&A with Mac – Security failings “a cultural issue”

AUSTIN, TX – Mac McMillan, CEO of Austin, Texas-based IT security firm CynergisTek and chair of the HIMSS Privacy & Security Policy Task Force, has some strong opinions about privacy protections in healthcare nowadays. The short version? Things could be a lot better.

You’ve said that if any other industry had this many privacy and security breaches, “heads would be rolling.” What is wrong with healthcare? Is the information landscape just too complicated? Or is it a matter of culture?

It definitely is not that it’s too complicated. And I think a lot of people try to hide behind that. They say, ‘We can’t do it, because our industry is so unique.’ Well, it’s not any more unique than the financial industry or the energy sector, or things that go on in the federal government. It’s just a matter of really figuring out how to do it, and solving the problem. Quite frankly, I think the problem is still that it’s a cultural issue. Look at the number of breaches. Insider snooping is still rampant in healthcare. And that’s a cultural issue. It’s a cultural issue that says, ‘It’s OK for me to be looking at things that I’m not supposed to be looking at.’ Which goes right to the core of how the industry views security and confidentiality. Providers do not see security as an imperative yet. That’s not across the board, obviously. There are some folks out there who are actually getting it and trying to do a good job. And they’re making the investment, and I think they’re seeing the benefit of doing that now. They’re realizing that there is a benefit to doing these things correctly. But that’s not the majority.

[See also: Top 5 most common gaps in healthcare data security and privacy.]

You just took part in a webinar on HIPAA security risk analyses. What should organizations keep in mind when undertaking them?

One of the things they really need to focus on is understanding and appreciating where their personal health information is – even more than before. HIPAA has always had a requirement for organizations to map where their personal health information is, and to build their programs around that and understand what the risks are to that data, whether it’s at rest or in transit. But with the requirements being levied under the HITECH rules, it’s getting more and more specific. And there’s more and more emphasis being placed on really knowing where that data is, who’s touching it, where it’s being sent, the relevance or the appropriateness of where it’s going, and where it’s residing. And also whether or not they really, truly assessed the risk to that information properly – “reasonably” is the term the government uses – and then took appropriate measures to protect it. More and more, they’re looking at these breaches that are occurring. They’re going to conduct 150 audits between now and next October, spread out among providers and payers and business associates – they now have to be ready to receive either an audit or an investigation, depending on the circumstances, and they can’t just sit there and hide behind the fact that they’ve done a cursory risk assessment.

What do you suspect most providers will discover after those analyses? Robust security, or flaws they need to fix?

I suspect almost all of them are going to still have areas that they need to address. That’s been our experience all along. When I look back at the risk assessments that our company has conducted over the last year, I’m just absolutely amazed at the amount of remediation that a lot of organizations are still having to do. And part of it is because they just really have not invested in security yet. A classic example is that we still have hospitals out there that don’t have staff dedicated to the security function. They don’t have all of their policies and procedures documented. Many of them have not invested in the technologies that are necessary for them to put those controls in place. We still have organizations that are wrestling with whether to encrypt e-mail! You would think that would be a no-brainer. But when you look at the requirements being talked about for meaningful use Stage 2, they’re recommending even more security requirements be baked in, because there are incentives and penalties tied to that. And that, quite frankly, has gotten people’s attention and gotten them to spend money on security.

What threats keep you up at night?

I think probably the top two are, first, insider abuse – individuals who have elevated privileges who can affect things, either inadvertently or deliberately, but who are not monitored. People like database administrators, who can bypass the audit capabilities of the application by going directly into the database itself and making changes. We have very little monitoring of what goes on there. That’s one of those 800-pound gorillas that no one wants to talk about. Now we’re beginning to talk about controls around applications, certified EHRs that have the ability to audit, and yet all of the data that’s associated with that sits in a database that a database administrator can go directly into, using their privileges and their level of access, and make changes that will never be seen. It’s one of those back doors that haven’t gotten a lot of attention yet. The second biggest issue in my mind is an organization just really not knowing or having a good handle on where their data is, and where it’s going. There’s a lot of emphasis being put on the EHR because of meaningful use. Well, the EHR typically is one of a handful of systems in a hospital environment. There are many more systems that have personal health information in them. Sometimes I think we do a disservice by placing so much emphasis on EHRs and obfuscating the rest of the environment where there’s still a lot of information that can be compromised and mishandled.
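The audit gap McMillan describes can be sketched in a few lines. The schema, table names, and `update_diagnosis` function below are hypothetical, invented only to illustrate the pattern: when audit logging lives in the application code rather than in the database itself, anyone who can run SQL directly leaves no trace.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (id INTEGER PRIMARY KEY, diagnosis TEXT);
CREATE TABLE audit_log (id INTEGER PRIMARY KEY AUTOINCREMENT,
                        patient_id INTEGER, action TEXT);
INSERT INTO patient VALUES (1, 'original entry');
""")

def update_diagnosis(patient_id, diagnosis):
    """Application path: every change is paired with an audit row."""
    conn.execute("UPDATE patient SET diagnosis = ? WHERE id = ?",
                 (diagnosis, patient_id))
    conn.execute("INSERT INTO audit_log (patient_id, action) VALUES (?, ?)",
                 (patient_id, "updated diagnosis to " + repr(diagnosis)))

# Change made through the application: it is audited.
update_diagnosis(1, "revised by clinician")

# A DBA with direct database access can skip the application entirely:
conn.execute("UPDATE patient SET diagnosis = 'silently altered' WHERE id = 1")

audited = conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]
print(audited)  # 1 -- only the application-path change left a trace
```

The direct `UPDATE` changes the record but the audit log records only one event, which is why audit controls enforced inside the database (triggers, database-level activity monitoring) matter in addition to certified-EHR application auditing.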

[See also: Privacy hindering EHR progress, say researchers.]

Ultimately, are you optimistic?

I am, actually. And I base that on a couple of things. I lived through what I call the transformational process that occurred in the federal government back in the 1980s. In the early ‘80s we didn’t have really good information security programs or controls in the federal space. But we started that process, and began to build those requirements and began to require organizations to certify and accredit their systems. And we kind of went through the same transitional process. People said, ‘This stuff is getting in the way, it’s slowing us down, why do we need all this security?’ But we did it anyway. And we worked through that. It took a while, but today it’s second nature. Same thing with the banking industry. If you look at the industry in the ‘80s, it was the same thing. The larger banks that were more sophisticated were able to do it much faster. The mid-size and community banks were very slow to adopt. It was very painful. But those requirements didn’t go away, and eventually the industry evolved and adopted them, and today it has very standard ways of doing IT security. It’s an evolutionary process. We started this IT security evolution in healthcare in 2005, with the HIPAA security rule. So when you think about it, healthcare has only been doing this in an organized way since 2005. That was only six years ago. Back in 2005, nobody was talking about data leakage in healthcare. Nobody was talking about wireless or Web application security, or encrypting databases. Today they are. There’s been a big change in the industry in a short time. But are we done? We’re nowhere close to done. We still have a long way to go.

From: http://www.healthcareitnews.com/news/qa-security-failings-cultural-issue-says-expert?page=0,0

September 5, 2011
