What Software Companies Don’t Want You to Know About Your Data Security and Liability – Chapter 13
8 Secrets of Software Companies and the Truth You Need to Know
I have spent close to 14 years building cloud-based software for doctors. I was a pioneer in that area. Patient data security was always at the top of my list of concerns.
Many software companies have been spreading misinformation about data security and your level of exposure. I stop short of calling them lies because, based on what I hear software companies saying, it is probably more a lack of knowledge and experience than deliberate deception.
As a doctor, that bothers me. What if I had listened to them and then realized later how much they were actually costing me?
Here are eight facts software companies do not want you to know:
- There are two types of systems
There are basically two types of software systems: client server and cloud- or Web-based.
- Client server means the server and data are stored in the doctor’s office. Then other computers in that office connect to that internal server. All the computers and the server have to have software installed on them. The software needs to be updated on a regular basis. Examples are ChiroTouch and Platinum.
- Cloud- or Web-based means the server and data are stored in the cloud—or more accurately, stored on a server that is in a data center connected to the doctor’s practice by the Internet. The software is also stored on those servers. You can think of it like the online version of QuickBooks. Genesis is a cloud-based product.
- You still own your data if it is stored in the cloud
Here is where the misinformation starts. Client server software companies have been telling doctors that if their data are on a cloud server, they don’t own them. There’s no other way to say it—it is a big fat lie. You always own your own data. It doesn’t matter where the server is.
- You can access your data if you switch software companies
Of course you can. Client server companies have been telling clients just the opposite for years: “If you ever leave that company, you can’t access your data again.” It is a scare tactic—misinformation—for several reasons. First, if a company ever held your patient data and would not give you access to it, it would be illegal. By law, cloud-based systems must store PHI (protected health information) for seven years (or whatever the legal requirement is for your state). Your ability to access data if and when you switch software companies is actually much better in the cloud. We’ll look at that in more detail later.
If you are going to go with a cloud-based solution, you should make sure the company has extensive experience and a long track record. Do not choose a cloud-based company that just happens to pop up or a client server company that suddenly decides to build a cloud-based version of their product. Even though they seem similar, they are very different. I can tell you this based on 15 years of experience with cloud-based technology.
- A cloud-based company cannot hold your data hostage if you leave
Maybe software companies are unaware of this, or maybe it’s another case of misinformation. Or maybe they have no clue about running a business. But I have my own opinion. There are legal and contractual protections against exactly this. From that perspective, your data are more than safe should you decide to go with a cloud solution, assuming your agreement with the company spells that out, which it should.
The truth is that we are all in business, so let’s think about this pragmatically as well. Imagine what would happen from a PR standpoint if a cloud-based system withheld access to a former client’s patient records. It just doesn’t make sense. In the age of Twitter, Facebook, and other social media outlets, withholding access to a client’s data for no real reason, legal or not, would be just plain stupid. Most cloud-based systems have a clause in their contracts that covers former clients who need to gain access to patient files.
Again, consider the alternative. You buy a new client server system. You use it for a few years. You decide to go in another direction. Maybe you choose to move to the cloud. Five years later, a patient has a legal case unrelated to your practice and requests records that were on your old client server system from seven years ago. By law, you are required to provide them.
You go into the dark recesses of your office where your old server is. Hopefully, you still have a computer connected to the server. In any case, you haven’t fired up either of those babies in five years. Who are you going to call? How will you get the records? What if the server doesn’t even turn on?
If you don’t have a computer hooked up to that server, you’ll need to do so. Will a new computer be compatible? In any case, it will need to have the software installed on it. If you don’t have the software anymore, do you think that old software company will actually give you a license? What if they were bought out in the meantime? (By the way, there’s a reason all those client server systems are getting bought out.)
- Your data are safest in the cloud
PHI data are some of the most valuable data on the black market. This is the question you should be asking: Where will a hacker most likely try to get that data? You might think it makes sense for them to go to a large data center where the most data are stored. But here is the correct answer: They will go where it is easiest to get the data.
- A HIPAA-compliant data center is the hardest place for a hacker to get data
My software is cloud-based, so our data are stored in a HIPAA-compliant data center similar to the data centers that store Wall Street information. The data center’s security system requires biometric scanning just to enter the building. The power source to the center has diesel generator backups in case of a catastrophe. The data centers are among the first to receive diesel fuel, even when there is a shortage and even before the gas stations get it. There is 24/7 security on site. Data security is their business, so the data center has the latest firewall protection measures in place and is constantly updating them. It’s like a Fort Knox for data. The connection from the doctor’s office to the data center has the latest banking-level encryption required by law. Every keystroke is protected.
If you were a hacker, would that be the place you would go? Consider the alternative.
We have talked to doctors who were told that keeping their data in their own office was safer. Their office network likely has no firewalls and is probably not updated on a regular basis. There are many holes in the system that a hacker could penetrate. For example, many of these systems tout online patient forms that send intake data directly into the software. The problem is that this also leaves a big fat hole a hacker can penetrate. If a software company finds a vulnerability in its system, how does it deploy a fix to protect your data? The only way is through a software update that would have to be performed manually at your office. Would they really be able to reach out to thousands of practices and make sure it is performed correctly? If I were a hacker, I would do a Google search for physicians in any given area and start hacking. They are the weakest, most vulnerable link.
- You are liable if your data are stolen
You bet. Big time. If your data are stolen because of negligence (e.g., using software like one of these client server systems), the fines are all yours. That software company has zero liability. Even if they were liable, I would bet they have insurance against those types of claims. They will never feel it. But it will put you out of business.
On the other hand, with a cloud-based system, you have essentially outsourced the liability, since the system is entirely contained and HIPAA-compliant. If the data center gets hacked, you will most likely have zero liability. Cloud-based software companies typically carry hefty data security insurance policies.
What will it cost you if your data are stolen?
The fines are considerable. Remember, each patient record that is compromised counts as one occurrence, even if that patient has not been in your office for some time. Fines are assessed per occurrence and per year you have had that patient record on file. So if you have a patient’s data on file and you haven’t seen that patient in seven years, that would count as seven violations. The minimum fine is $100 per violation, and the maximum is $50,000 per violation (see details below).
There are four categories
CE stands for covered entity, which could be your office in the following cases:
Category 1: A violation that the CE was unaware of and could not have realistically avoided had a reasonable amount of care been taken to abide by HIPAA Rules
Category 2: A violation that the CE should have been aware of but could not have avoided even with a reasonable amount of care (but falling short of willful neglect of HIPAA Rules)
Category 3: A violation suffered as a direct result of willful neglect of HIPAA Rules in cases where an attempt has been made to correct the violation
Category 4: A violation of HIPAA Rules constituting willful neglect where no attempt has been made to correct the violation
Not sure which category a given breach falls under? That’s exactly the point. Guess what? You’ll have to pay a lawyer just to figure that out and argue it.
Category 1: Minimum fine of $100 per violation up to $50,000
Category 2: Minimum fine of $1,000 per violation up to $50,000
Category 3: Minimum fine of $10,000 per violation up to $50,000
Category 4: Minimum fine of $50,000 per violation
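To make the arithmetic concrete, here is a minimal sketch of how the exposure adds up under the chapter’s rule of thumb: one violation per compromised record per year on file, fined between the category minimum and the $50,000-per-violation cap. The function name and structure are illustrative, not part of any official tool, and this is not legal advice.

```python
# Hypothetical illustration of HIPAA fine exposure, not legal advice.
# Rule of thumb from the chapter: each compromised patient record counts
# as one violation per year it has been on file.

CATEGORY_MIN_FINE = {1: 100, 2: 1_000, 3: 10_000, 4: 50_000}
MAX_FINE_PER_VIOLATION = 50_000

def fine_range(records: int, years_on_file: int, category: int) -> tuple[int, int]:
    """Return the (minimum, maximum) total fine for a breach."""
    violations = records * years_on_file  # one violation per record per year
    low = violations * CATEGORY_MIN_FINE[category]
    high = violations * MAX_FINE_PER_VIOLATION
    return low, high

# Example: 100 old patient records, each on file for 7 years, Category 1.
low, high = fine_range(records=100, years_on_file=7, category=1)
print(f"${low:,} to ${high:,}")  # $70,000 to $35,000,000
```

Even at the Category 1 minimum, a modest breach of old records reaches six figures, which is why the insurance and compliance steps at the end of this chapter matter.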
Potential jail time
Tier 1: Reasonable cause or no knowledge of violation – up to 1 year in jail
Tier 2: Obtaining PHI under false pretenses – up to 5 years in jail
Tier 3: Obtaining PHI for personal gain or with malicious intent – up to 10 years in jail
- The government will enforce these laws
There is a major misconception about this. In the early years of HIPAA, the government did not effectively enforce many HIPAA violations. It was a typical example of the government coming up with a great law but forgetting it would be only as good as their ability to enforce it. So they didn’t for a while.
With the economic downturn and the lack of revenue, the government started getting creative. That, combined with the rise in data security awareness, got the government’s attention. Who better to recapture revenue from than rich doctors? The Obama administration hired private parties to find violations. These HIPAA mercenaries are paid a percentage of the penalty received by the government. The tiers and categories were signed into law in 2009 by President Obama as part of the American Recovery and Reinvestment Act, one of the first bills he signed, in the very early days of his administration.
Yes, the government will enforce these laws.
- Do you own your data if it is in the cloud? Yes, always.
- Do you have access to your data in the cloud? Yes, always.
- Is your data safer in the cloud? Yes, much safer.
- Do you have more liability in the cloud? No, much less.
What Should You Do?
- In my opinion, every practice owner should contact a HIPAA compliance attorney and request a referral to an insurance carrier that will cover you in case of a data breach.
- Hire a data security company with experience in HIPAA compliance. Put a plan in place to fix issues you may have. That will go a long way to protect you from past and future violations.
- Move to a more secure system in the cloud. Again, I am biased here, but there is a reason the biggest software providers are moving to the cloud as data security becomes a bigger concern.
Disclaimer: I am not an attorney, and this book should not be considered legal advice in any way. Always consult with your attorney for legal advice on these matters.