By Cindy Atoji
March 4, 2008 | Network-based communications devices are proliferating, but they pose security challenges for network executives and security administrators. Increasing HIPAA mandates for data protection raise questions about how to authenticate individuals on porous networks while still giving them the access they need.
Sean Convery, chief technology officer at Sunnyvale, Calif.-based idEngines, and an expert in secure network design, says that identity management technologies are one solution. User identity, says Convery, can be the foundation for network security, establishing control over access, and reducing time spent on audit and compliance. Digital HealthCare & Productivity spoke with Convery about identity-based network access solutions and how they can control access to sensitive data.
DHP: Why is there an increased need for authenticated role-based access in health care network organizations?
Convery: Besides the many network-connected devices, we’re seeing a variety of ways of connecting to the network. And we’re seeing not just doctors and other medical staff connecting to the network to access different systems; hospitals are also requiring guest access. So the ability to identify users when they connect to the network and map them to certain sets of privileges is an emerging need in the health care industry, helping to ensure that the broad category of privacy-sensitive systems is protected.
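The role-to-privilege mapping Convery describes might be sketched as follows. This is an illustrative toy, not any vendor's API; the role names and resource labels are hypothetical:

```python
# Hypothetical role-based access map: a connecting user is identified,
# assigned a role, and the role determines which network resources are reachable.
ROLE_PRIVILEGES = {
    "physician": {"ehr", "imaging", "internet"},
    "nurse": {"ehr", "internet"},
    "guest": {"internet"},   # hospital visitors get Internet-only access
}

def privileges_for(role: str) -> set:
    """Return the resources a role may reach; unknown roles get nothing."""
    return ROLE_PRIVILEGES.get(role, set())
```

The key point is the default: an unrecognized role yields the empty set, so an unidentified user gets no access rather than full access.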
DHP: How are increasing HIPAA mandates driving health care providers to control patient, staff, and visitor Internet access to clinical applications and data?
Convery: I think what you’re seeing out of HIPAA are mandates for protection of the data itself. The trick is that the data resides in a wide variety of applications; some of those are newer systems and others are legacy systems. So we’re seeing some organizations focus less on the location of the data and on authenticating access to the data itself, and instead authenticate individuals to the network and treat that as a base level of security.
Because the network is becoming more porous, the term “de-perimeterization” has become almost a cliché in the security industry. What it means is that we’re getting away from the classic firewall model of good guys on the inside, bad guys on the outside. Many organizations have moved to a much more distributed security model, where you have security capabilities in a wireless device, in a laptop, in an Ethernet switch, or in a server. There’s much less “inside versus outside,” because now you have collaboration going on across the Internet and inpatient records being shared outside the health care organization itself. A lot of that connectivity is driving the need for that first point of control: should you be on the network in the first place, and what broad categories of resources should I allow you to access?
DHP: What are some best practices in secure network design, identity, and security?
Convery: User identity as an overall concept hasn’t been broadly applied to the network until very recently, so we’re seeing this as an emerging best practice. Another best practice is the need to support heterogeneous infrastructure. Back in the days of VPN (Virtual Private Network) solutions, it was easy to have your VPN solution come from a single vendor, because you only had a couple of VPN devices. But with role-based access control, you need to integrate with a wider variety of systems. So you might have a VPN infrastructure from NetScreen, a wireless infrastructure from Aruba, and a wired infrastructure from Cisco, and coordinating all of these is very challenging.
The last best practice is centralization of policy decisions combined with distribution of policy enforcement. Because of the multitude of security devices, rather than configuring each device individually to act in a secure role, a more appropriate approach is to deploy a centralized policy server that can provide identity, authentication, and authorization checks to all of these devices throughout the network. That also gives you the centralized audit trail that a HIPAA type of regulation would require.
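The pattern of centralized decisions with distributed enforcement might look like the sketch below: every enforcement point (a switch, a wireless controller) defers to one decision function, which also produces the single audit trail Convery mentions. All names here are illustrative, not a real product API:

```python
import datetime

# Central audit trail: one record per access decision, regardless of
# which enforcement point asked.
AUDIT_LOG = []

# Roles the (hypothetical) policy allows onto the network at all.
PERMITTED_ROLES = {"physician", "nurse", "guest"}

def policy_decision(user: str, role: str, device: str) -> str:
    """Central decision point: return 'permit' or 'deny' and log the decision."""
    decision = "permit" if role in PERMITTED_ROLES else "deny"
    AUDIT_LOG.append({
        "time": datetime.datetime.utcnow().isoformat(),
        "user": user,
        "role": role,
        "enforcement_point": device,
        "decision": decision,
    })
    return decision
```

Because the wireless controller and the wired switch both call the same server, policy changes happen in one place and the audit record is complete by construction.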
DHP: How does security affect performance?
Convery: Security has often been thought of after the fact — organizations have said, “OK, I designed my network, and now I’m being forced to put some security on top of the network,” and this has been something that can limit performance. Security technology has improved in performance over time, but obviously the network has as well.
In the past, identity technologies have been very focused on the application itself: you had applications, user directories or databases, and then all these proprietary protocols that the applications and databases used to communicate. These systems required coordinating a wide variety of end-user software as well as data-center software, so there was a lot of complexity. We try to make this simpler by elevating the decision to the network layer rather than trying to provide all that coordination at the application layer. At the network layer there are standards-based protocols, RADIUS from the IETF and 802.1X from the IEEE, that let you integrate these systems without the pain of proprietary integration and software development, taking advantage of protocols that have existed on network devices for many years.
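The network-layer flow Convery describes can be modeled in miniature: in 802.1X, the network device (the authenticator) forwards a user's credentials to a central server, which answers with a RADIUS-style Access-Accept carrying authorization attributes, or an Access-Reject. This toy uses a plain dictionary as the user directory and made-up attribute names; it does not speak the real RADIUS wire protocol:

```python
# Hypothetical user directory the central server consults.
USER_DIRECTORY = {
    "drsmith": {"password": "s3cret", "role": "physician", "vlan": 10},
}

def access_request(username: str, password: str) -> dict:
    """Model a RADIUS-style exchange: Accept with attributes, or Reject."""
    entry = USER_DIRECTORY.get(username)
    if entry is None or entry["password"] != password:
        return {"code": "Access-Reject"}
    # On success, hand the enforcement point what it needs to place the
    # user in the right role and VLAN.
    return {"code": "Access-Accept", "role": entry["role"], "vlan": entry["vlan"]}
```

The point of the pattern is that the switch or controller never needs to understand the directory; it only enforces the attributes returned with the Accept.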
DHP: What is the ROI that can be achieved with tightened security?
Convery: I think the ROI we see most directly comes from the configuration and management of the network systems themselves: you get a huge reduction in configuration complexity, time to deploy, and training time. And on the back end of that, you have big savings from the audit perspective, so there’s a significant return there as well.
DHP: You’re a board member of the OpenSEA Alliance. What is this group?
Convery: OpenSEA stands for Open Secure Edge Access, and the alliance is an organization founded to promote the development and adoption of 802.1X, a standard that allows you to identify individuals as they connect to the network. The OpenSEA Alliance has a number of members, including HP, Aruba, and Symantec, who have joined forces to promote 802.1X and to develop an open source, cross-platform 802.1X client. Much as the Mozilla Firefox browser provided a standards-based reference implementation that moved the Web browser industry forward, we hope to do the same with the OpenSEA Alliance: ensuring interoperability across multiple vendors, supporting use and operation across multiple platforms, and becoming a reference implementation that can be embedded into other commercial and open source offerings.
We’re currently in a code freeze and in the 2.0 phase of the client, which means we think it’s very close to being ready for prime time in production deployment. We’re just waiting for a little more feedback from the early trials taking place at different customer sites to validate that it’s truly stable and robust.