Many of the applications we develop at Echobind involve accessing protected health information and patient data. We don’t take this responsibility lightly. As such, HIPAA compliance is a phrase used quite a bit between our team and our clients.
If you’re developing a web or mobile app that accesses health data, you can find a number of resources online recommending dos and don’ts. We’ve linked several sites throughout this article. Our focus today is helping companies understand how they can reduce, or completely eliminate, their exposure and risk.
If you’re reading this, you’re probably already aware of what HIPAA is, but here’s a formal description:
HIPAA (Health Insurance Portability and Accountability Act) is US legislation that provides security provisions and data privacy for safeguarding medical information.
The part of HIPAA that most people know and refer to is the set of national standards established by the US government to protect electronic health care transactions. To summarize: any breach is bad, but a breach that exposes medical records and health information is significantly worse.
HIPAA applies to healthcare organizations, health clearinghouses, employer-sponsored health plans, insurers, medical service providers, and any business associate (the broad term for consultants, accountants, DevOps contractors, etc.) that has access to personal health information. Even if you’re not directly working on a healthcare application or don’t fall into one of the previously listed groups, HIPAA guidelines are generally helpful security precautions to take into consideration for any app.
So what happens if you violate HIPAA? Great question. First, you’ll probably end up on the government-sponsored HIPAA “wall of shame.” Second, it’s probably going to cost you: the average data breach in 2017 cost $3.62 million, and that same study put the odds of an organization having a data breach at 1 in 4.
So now that we know a bit more about HIPAA, who it applies to, and what a worst-case scenario looks like, let’s talk about how we can reduce our risk.
The best rule of thumb we can share is to limit your PHI access to an absolute minimum. Organizations have to constantly ask themselves the question: Do we really need this data to achieve our goal?
For example, if your API strategy involves passing back related records that you aren’t directly using, don’t. If you’ve developed an app, properly dispose of the cache when a user logs out, and make sure users are logged out automatically if they don’t log out themselves. Spend the extra time and resources to properly log each and every time an individual accesses PHI, and have a sane way to retrieve that information when you need it.
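The audit-logging advice above can be sketched as a thin wrapper around PHI reads and writes. Everything here is hypothetical: the `AuditEntry` shape, the function names, and the in-memory store are illustrative only, and a real implementation would persist entries to durable, append-only storage.

```typescript
// Minimal sketch of a PHI access audit log. The AuditEntry shape and
// in-memory array are illustrative; production code would write to
// durable, append-only storage instead.
interface AuditEntry {
  userId: string;                // who accessed the record
  recordId: string;              // which PHI record was touched
  action: "read" | "write";      // what they did
  timestamp: string;             // ISO 8601, for later retrieval
}

const auditLog: AuditEntry[] = [];

function logPhiAccess(userId: string, recordId: string, action: "read" | "write"): void {
  auditLog.push({ userId, recordId, action, timestamp: new Date().toISOString() });
}

// The "sane way to retrieve that information when you need it":
function accessesByUser(userId: string): AuditEntry[] {
  return auditLog.filter((entry) => entry.userId === userId);
}

logPhiAccess("clinician-7", "patient-123", "read");
// accessesByUser("clinician-7") now contains that single entry.
```

The key design choice is that logging happens in one chokepoint, so every code path that touches PHI goes through it rather than each feature logging ad hoc.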
Many of these tips seem obvious to developers once they hear them, but developers learn to build apps using conventions that optimize for speed and user experience. Developers may need to write more code, and users may need to log in more often. That’s alright: we’re trading some convenience for a higher level of security.
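The automatic-logout tradeoff can be sketched as a simple inactivity timer. The 15-minute default and the `logout` callback are placeholder choices for illustration, not prescribed values.

```typescript
// Illustrative inactivity timer: call logout() after a stretch with no
// user activity. The 15-minute default is a placeholder, not a mandate.
function createIdleTimer(logout: () => void, timeoutMs = 15 * 60 * 1000) {
  let timer: ReturnType<typeof setTimeout> | undefined;

  function reset(): void {
    // Any user activity restarts the countdown.
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(logout, timeoutMs);
  }

  function stop(): void {
    if (timer !== undefined) clearTimeout(timer);
  }

  reset(); // start counting immediately
  return { reset, stop };
}

// In a browser you would wire user interactions to reset(), e.g.:
// const idle = createIdleTimer(logoutAndClearCache);
// document.addEventListener("click", idle.reset);
// document.addEventListener("keydown", idle.reset);
```

Pairing the `logout` callback with the cache-disposal step mentioned above keeps stale PHI from lingering on an unattended device.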
In many cases, companies want to share data with higher-ups or third-party partners for all sorts of legitimate reasons, including invoicing, analytics, or even selling the data. This is a situation where you can completely eliminate your risk. If your company crafts a strategy to anonymize your data properly, it is no longer deemed PHI and no patient consent is required. (And if you need help with such a strategy, contact us.)
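As a sketch of what “anonymize properly” can mean in code: strip every direct identifier before the data leaves your system. The record shape and field names below are hypothetical, and real de-identification should follow HIPAA’s Safe Harbor method, which enumerates eighteen identifier types, rather than this abbreviated list.

```typescript
// Illustrative de-identification: keep only fields that cannot identify
// a patient on their own. The field names are hypothetical; HIPAA's
// Safe Harbor method lists 18 identifier types that must all be handled.
interface PatientRecord {
  name: string;
  email: string;
  zipCode: string;
  diagnosis: string;   // e.g. an ICD-10 code
  ageBracket: string;  // e.g. "30-39"
}

type AnonymizedRecord = Pick<PatientRecord, "diagnosis" | "ageBracket">;

function anonymize(record: PatientRecord): AnonymizedRecord {
  // Copy out only the non-identifying fields; everything else is dropped.
  const { diagnosis, ageBracket } = record;
  return { diagnosis, ageBracket };
}

const shared = anonymize({
  name: "Jane Doe",
  email: "jane@example.com",
  zipCode: "02144",
  diagnosis: "J45.20",
  ageBracket: "30-39",
});
// `shared` contains only diagnosis and ageBracket; name, email, and
// zipCode never leave the function.
```

Note the allowlist approach: fields are explicitly copied out rather than selectively deleted, so a new identifying field added to `PatientRecord` later is excluded by default.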
You can’t lose what you don’t have. If anonymized data were to be stolen, your organization has not committed a HIPAA violation. You’ll still probably want to figure out how the bad guys got into your network and patch those servers, but patient data won’t be exposed and you won’t have to send one of those credit monitoring letters to everyone you know.
Despite the hundreds of places that will let you take a quiz and print a certificate in exchange for money, there is no official HIPAA certification to be had. Training falls to organizations themselves, and implementation varies widely. Sometimes HIPAA training is baked into the onboarding of new team members and revisited often; other times, training consists of a compliance officer asking a developer to sign a piece of paper. Health & Human Services has this to say:
The HIPAA Rules are flexible and scalable to accommodate the enormous range in types and sizes of entities that must comply with them. This means that there is no single standardized program that could appropriately train employees of all entities.
As developers, we strongly encourage anyone writing code to bake HIPAA into their app. This includes identifying and maintaining database tables that contain PHI, creating automated tests to ensure PHI isn’t leaked, and overcommunicating the reasoning for business logic related to safeguarding PHI via comments.
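One of those automated tests might look like the following sketch. The `sensitiveFields` list and the `serializePatient` helper are hypothetical stand-ins for whatever serializer your API actually uses; the point is asserting, on every test run, that known-sensitive fields never appear in an outgoing payload.

```typescript
// Sketch of a PHI leak-check: assert that the serialized API payload
// never contains known-sensitive fields. serializePatient is a
// hypothetical stand-in for your real serializer.
const sensitiveFields = ["ssn", "email", "dateOfBirth"];

function serializePatient(patient: Record<string, unknown>): Record<string, unknown> {
  // Drop sensitive fields; pass the rest through.
  const { ssn, email, dateOfBirth, ...publicFields } = patient;
  return publicFields;
}

function assertNoPhiLeaked(payload: Record<string, unknown>): void {
  for (const field of sensitiveFields) {
    if (field in payload) {
      throw new Error(`PHI leak: payload contains "${field}"`);
    }
  }
}

const payload = serializePatient({
  id: 1,
  ssn: "000-00-0000",
  email: "a@b.com",
  dateOfBirth: "1980-01-01",
  diagnosis: "J45.20",
});
assertNoPhiLeaked(payload); // throws if the serializer ever regresses
```

Running a check like this in CI turns “don’t pass back records you aren’t using” from a code-review reminder into a failing build.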
Lastly, developers should want to avoid risk themselves. As a technology lover, I don’t want a drone for fear that I may cause an international incident. And as a developer, I don’t want credentials to production databases unless it’s absolutely necessary.
We’ll keep this theme going in a future post and discuss how we leverage hosting providers like Aptible, TrueVault, Datica, and Heroku Shield and compare them to the offerings of AWS, Azure, and Google Cloud to further reduce our risk.