Security Expert Explains California Law

Companies face stiff fines if they fail to inform customers that confidential information may have been seen by hackers.

George V. Hulme, Contributor

June 23, 2003

California's new security-breach disclosure law, SB 1386, goes into effect July 1 and could forever change the way corporations handle security breaches of customer information. The law has been widely discussed in legal circles for nearly a year. But there's been much confusion as to what the law actually requires companies to do. And it appears that many companies haven't taken steps to prepare for the law's requirements. information senior editor George V. Hulme interviewed Mark Rasch, former head of the United States Justice Department's computer crimes unit and current senior VP and head of cyberlaw at the managed security services firm Solutionary Inc., about the law, its potential impact, and how companies can comply with SB 1386.

information: What constitutes a security breach in this law?

Rasch: If there has been a compromise to a system that has this kind of information, you have to assume that the information was accessed, even if there is a good chance that it wasn't.

information: Could a common Bugbear virus or worm that triggers a Trojan or keystroke logger be enough to force disclosure to customers?

Rasch: Not necessarily. It has to be the kind of compromise that could reveal the [personal] information. The problem with a Bugbear kind of compromise, of course, is that it could open up a back door. And there would be no way of easily knowing if the back door was opened that could eventually reveal the information. But if you get a Bugbear infection and you can validate that the back door had not been exploited, then it would probably not be a reportable event, unless the Trojan itself gathered data and transmitted it.

information: Will the law require more detailed forensics analysis of not only hacker and employee breaches, but also of virus and worm infections?

Rasch: Exactly, and in some cases [it will require] an independent forensic analysis. The problem is that most companies don't have the tools and the facilities to do that kind of forensics examination. You really need to be able to dissect the worm and study its infection. You need to look at telephone logs sometimes because they [Trojans, malware] can open up a dial-up port. There are lots of things that need to be done just so you can validate that you don't have to report the incident.

Another vagary of the law is that it doesn't distinguish between internal and external unauthorized access. You typically think of unauthorized access as a hacker coming in and stealing personal data. But it could be a guy from accounting who isn't allowed to look at credit-card data and does. That fact itself triggers a reporting requirement.

information: Is there any accounting for intent in the law? Or is just the fact that someone who isn't authorized saw this type of information enough to make it reportable?

Rasch: Technically, yes. But it's even worse than that. If the guy from accounting merely wanders into the room where the data is held, even that may be enough to trigger a reporting requirement. That may sound absurd, but it does make some sense. How do we know that the guy from accounting didn't read all of that stuff and intend to commit identity theft or fraud? We don't. And the only way to guard against it is to inform the individuals whose information may have been breached so they can protect themselves and look more carefully at credit reports and applications for credit.

information: Can companies avoid all of this by encrypting this information?

Rasch: Actually, the way to avoid this is to increase overall security. Forget about encryption. Encryption is just one form of security. If it's encrypted, and the guy breaks the crypto, then it's no longer encrypted. It also has to do with the form of encryption as well. You can have a form of encryption that is protecting the file directory but not every bit of information on it. And, encryption is only as strong as the [password] protecting the key. So encryption isn't necessarily a savior. But look, if your data is encrypted, and encrypted with strong encryption, you're in pretty good shape.
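
(A minimal sketch of Rasch's key-strength caveat, using only Python's standard library and a hypothetical passphrase: the cipher protecting the data can be arbitrarily strong, but if the key is derived from a guessable passphrase, a dictionary attack recovers the key without ever attacking the cipher itself.)

```python
import hashlib
import os

salt = os.urandom(16)

def derive_key(passphrase: str) -> bytes:
    # The data-encryption key is derived from a passphrase; the cipher
    # downstream can be as strong as you like, but the key inherits the
    # passphrase's (lack of) entropy.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

real_key = derive_key("summer2003")  # hypothetical weak, guessable passphrase

# A dictionary attack never touches the cipher; it just guesses passphrases.
for guess in ["password", "letmein", "summer2003"]:
    if derive_key(guess) == real_key:
        print(f"key recovered by guessing the passphrase: {guess!r}")
        break
```

In other words, "strong encryption" in practice means the whole chain: the algorithm, which data it actually covers, and the password or key management protecting the key.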

information: What if you just encrypt names, first and last?

Rasch: Then it's no longer personally identifiable and doesn't have to be reported. The guy [snoop] got junk. But we also know of all of these circumstances where people have taken encrypted password files and just brute-forced them open and were able to convert them into personally identifiable information.
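
(To illustrate the brute-forcing point, a hypothetical standard-library Python sketch: names "protected" with an unsalted hash can be re-identified simply by hashing candidate names from a list and comparing digests, because names are drawn from a small, guessable space.)

```python
import hashlib

def pseudonymize(name: str) -> str:
    # An unsalted hash of a low-entropy value such as a name is not anonymous:
    # anyone can enumerate candidate names and compare digests.
    return hashlib.sha256(name.lower().encode()).hexdigest()

# Hypothetical "de-identified" record keyed by a hashed name.
records = {pseudonymize("Alice Smith"): "card ending 1111"}

# The attacker hashes names from a list and looks for matches.
for candidate in ["Bob Jones", "Alice Smith", "Carol White"]:
    if pseudonymize(candidate) in records:
        print(f"re-identified: {candidate} -> {records[pseudonymize(candidate)]}")
```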

information: A survey just came out that said 40% of financial firms experienced a breach last year. Typically, financial-services firms are considered leaders when it comes to information security. Wouldn't this mean that there could be a lot of reporting next year?

Rasch: That's exactly right. But there are all kinds of ways to try and get away from this statute. You can try to not monitor. "If we don't know, we won't have to report it," a company might say. Well, you do so at your own peril. I think willful blindness is hardly an excuse for lack of reporting. And consumers expect this. Think about this from an intellectual standpoint, as to why this statute exists. You have no way of knowing that you're going to be a victim of identity fraud or theft until it's way too late. So you want an early warning mechanism. And that's what this law attempts to provide customers.

I think the good news is this statute, in and of itself, will force companies to do something about security. The easiest, cheapest, and most effective way to deal with this is to avoid the compromise in the first place. And for that, you really need to manage your vulnerabilities, fix them, and monitor your network continuously for exploits.

I had a case about a year ago. An online merchant had its computer network hacked, and the hacker got root [access] on that network. That network was connected to another network that contained credit-card data for thousands of customers. Because the guy had root [access], he could have gone from one network to the other. There was no evidence that he did. The evidence tended to indicate that he didn't. What do you do as a responsible and reasonable institution? Aside from fixing the problem and gathering the data for a forensics analysis and prosecution, what do you do with respect to your customers? Do you automatically assume that the data has been compromised and automatically generate new credit cards for the customers? Or, do you notify them without taking the remedial effort? Or, do you take remedial steps without notifying your customers?

Now, under this law [SB 1386], if you assume the data hasn't been compromised, and you're wrong, you are going to be in trouble under this statute.

In this case, we went to various credit-reporting and fraud agencies and gave them the numbers and had them monitored for unusual activity. If there was any unusual activity we would then notify the customers. We didn't want to go through massive reporting if there was nothing to report. Under this law, that approach is not an option.

And ask yourself this question: What would you do if you got such a notification? At a minimum you'd want a credit report every three months and a new credit card. And you'd want to be put on a fraud watch list. Who's going to pay for all of that?

information: How about penalties? Can the civil penalties be severe? There doesn't seem to be a ceiling.

Rasch: No, there doesn't, and the penalties can be severe. If I'm a lawyer representing a client in this situation, I'm not going to say don't worry because the penalties aren't that high, and that it's safe to just ignore the statute and pay the fine. Because the truth is, you're not only going to have to pay the fine, but you're going to have to disclose anyway. So then you're in the situation of not having protected your customers' data, not having told customers about the breach when you were required to, and having had to pay a fine. So where's your fiduciary responsibility to your shareholders and your customers? That's not a great situation to be in. So your best bet is to get secure.

information: This is going to be a boon to forensic analysis, isn't it?

Rasch: Sure. Because one of the advantages of forensics is that it can prove that the data wasn't compromised and that no reporting is required. And this will also be a boon to encryption. This is all stuff people ought to be doing anyway.

information: I imagine you just can't get through on ROT13 [very weak encryption] and say you're fine?

Rasch: Right, and that raises the question as to what is encrypted. Even encrypted access doesn't mean the files are encrypted. You have to go back and look at the method of encryption. The fallacy of encryption is that encrypted data at some point in its life cycle is going to be unencrypted. And if the compromise occurs at a point when the data is unencrypted then you have a reporting requirement.
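
(Rasch's ROT13 point, shown in a few lines of illustrative Python: ROT13 is a fixed letter substitution with no key at all, so applying the same transform twice restores the plaintext, and anyone who obtains the "scrambled" data can read it.)

```python
import codecs

record = "Jane Doe, account 0000-0000"  # placeholder data, not a real record

scrambled = codecs.encode(record, "rot13")   # ROT13 "encryption": no key involved
recovered = codecs.decode(scrambled, "rot13")

print(recovered == record)  # True: the transform is its own inverse
```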

information: Do you expect a flurry of reporting next year? What do you expect will happen in the first year as a result of this law?

Rasch: I expect initially companies will ignore this. But a number of things will happen. In the obvious cases, the big breaches like the one against CDUniverse, when there is a direct and obvious attack, you will see reporting. You will also see some companies pay more attention to securing this kind of data. You will see people challenge the statute. They're going to argue that the Internet is both national and transnational, and therefore no one state has the right to impose regulations for the entire nation. The question is whether this is the kind of thing that requires a uniform national scheme; if it is, then no one state can regulate it. Think about what would happen if California tried to regulate the gauge of railroad tracks. Well, that's unconstitutional, because you can't have every state setting the gauge of railroad tracks.

Return to main story, California's New Rules Of Disclosure

About the Author

George V. Hulme

Contributor

An award-winning writer and journalist, George Hulme has written about business, technology, and IT security topics for more than 20 years. He currently freelances for a wide range of publications and is a security blogger at information.com.
