
Why we need to stop cutting down security's tall poppies

We need to have a lower tolerance for lax security, but we also need to encourage those who are actually trying to do the right thing.
Written by Michael Lee, Contributor

More often than not, the information security industry focuses on the negative. It's hard not to. If we're not pointing out our numerous breaches and outdated systems, we're pointing out how we're getting owned by month-old vulnerabilities rather than by some flashy zero-day or advanced persistent threat.

And while it might sound like the noble and "right" thing to apologise, say that we can do better, and accept that we made stupid, avoidable mistakes, I think a lot of information security professionals are quite often simply too hard on themselves and on their colleagues. Why? Because when security is done right, you rarely ever hear about it.

Do we ever hear the CISO congratulated that yet another week has gone by and the company didn't make the headlines for some oversight? Do we ever hear the guys on the ground congratulated for picking up a bug in their software and closing it before any one of the thousands of attackers probing their system discovered it? Do we ever read, in the company's annual report, how the actions of an incident response team saved however many millions of dollars in lost revenue, reputational damage, and non-compliance fines, or headed off potential investigations by privacy regulators?

Of course we don't — it's easier to dismiss these things as the normal functions of their job.

I wouldn't have so much of a problem with that (every industry has its unsung heroes), but on top of the existing work they're not acknowledged for, we want, or even demand, that information security professionals go even further.

PayPal, for example, declined to pay 17-year-old Robert Kugler under its bug bounty scheme. Kugler discovered a cross-site scripting vulnerability on PayPal's site, but because he hadn't yet turned 18, he was disqualified from participating. My initial reaction was one of disbelief. Was he meant to wait until he came of age and leave the site vulnerable until then? Where did it say in PayPal's conditions that he needed to be of a certain age? And what does that say about younger hackers? That they're not good enough?

But as much as I thought PayPal's actions were dumb, I couldn't deny that it is doing a hell of a lot more than some other companies. And when PayPal wrote back to the young hacker and told him, in what felt like a veiled insult, that he had not practised "responsible disclosure", I was angry, but had to slowly force myself to agree.

I understand that it's a frustrating experience knowing of a vulnerability that should be seemingly trivial to fix, and yet seeing nothing happen, but I've also been on the side of writing code and seeing how small changes can have significant ripple effects across a project. For example, we face-palm when we don't see prepared statements used to avoid SQL injection attacks, but actually going through legacy code and rewriting everything can be a nightmare. It takes just a couple of minutes to run a tool like Acunetix against a site, but it can take days to trawl through code, often not even your own.
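To make that contrast concrete, here is a minimal sketch in Python, using a throwaway in-memory sqlite3 table; the table and function names are purely illustrative, not anything from PayPal's code. It shows the string-splicing pattern we face-palm over next to the prepared-statement fix:

```python
import sqlite3

# Throwaway in-memory database purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_vulnerable(name):
    # The face-palm pattern: user input spliced straight into the SQL string,
    # so input like "' OR '1'='1" rewrites the query itself.
    query = "SELECT email FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_prepared(name):
    # The fix: a ? placeholder keeps the input as data, never as SQL syntax.
    return conn.execute("SELECT email FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_vulnerable("' OR '1'='1"))  # injection succeeds: returns every row
print(find_user_prepared("' OR '1'='1"))    # safe: input treated as a literal, matches nothing
```

The fix is a one-line change per query; the nightmare is finding every such query buried in years of legacy code and verifying that nothing breaks when you rewrite it.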

That's why I don't think it's fair when we expect companies to fix problems within hours of their being reported and demand that engineers and developers do better. We often do so without giving any consideration to what happens behind the scenes, or to how tough the decision to completely shut down a service can be.

And caught up in the moment of discovering something a company has missed, it's easy to think we know better. Not everyone does it, but it's all too easy to ride the I-told-you-so train of righteousness. That's how we end up with hacktivist groups whose intent is to demonstrate how weak everyone's systems are. The intent is positive, but oftentimes the vulnerabilities disclosed aren't being actively exploited, and we end up with a bigger mess than we started with.

Where does this leave us, then? Do we let companies hide behind the concept of responsible disclosure? Not at all. Disclosure can be an important tool in motivating a company to do something, but we need to be more realistic about the time frames given for fixing problems, for certain types of vulnerabilities, on both sides of the fence.

The argument isn't for or against full or so-called responsible disclosure; it's about being smart about what is disclosed. Irresponsible bug hunters tend not to think far enough ahead and consider just how badly the information might be abused. After Google said that we should cut what is considered a reasonable disclosure period down to seven days, I worry that most bug reporters will see this as Google's blessing for barely-delayed full disclosure on anything they see.

However, Google's wording was very deliberate when it qualified its call for the seven-day period, stating that it is appropriate for critical vulnerabilities "under active exploitation". I certainly agree that 60- to 90-day disclosure blackouts are antiquated these days, but we need to be careful not to prematurely disclose vulnerabilities that, as sad as it is, were protected by some level of security through obscurity. The seven-day period really applies to vulnerabilities that are already being so widely exploited that the benefit of letting everyone else know outweighs the negative of adding them to every hacker's toolbox.

If we're not careful about limiting reporting to those vulnerabilities, we're simply creating more work for our already overtaxed information security professionals. Instead of discovering or fixing key, critical vulnerabilities, they're chasing bugs that are being exploited only because they're the hottest thing that someone leaked. It's like the psychiatrist who, while attending to a patient with their own genuine concerns, has to stop and deal with a frustrated client shooting everyone in the waiting room to prove he has issues.

As for PayPal, it had to address this issue while treading very carefully. I was one of the many who misplaced their anger, and there are probably countless others out there who didn't pick up on PayPal's message: that part of better disclosure is about not creating undue work for its security team, while also ensuring that people like Kugler get the recognition they deserve. It wasn't about telling Kugler he did the wrong thing and that kids who don't play by the rules aren't welcome; it was about showing him what the best way of helping security teams like PayPal's would have been, and politely suggesting that whoever he helps out in the future might also appreciate the responsible disclosure route.

But as it turns out, PayPal is well within its rights to refuse a payment, because even though its terms and conditions are sketchy, it's covered. It's true that PayPal doesn't clearly state in its bug bounty program that those younger than 18 aren't eligible, but it does clearly state that a verified PayPal account must be used to accept payment.

This is important because PayPal's general terms and conditions state that account holders must be 18 years of age or older. Given this, Kugler would have been unable to accept a payment. It also turns out that PayPal apparently already knew about the bug from a previous reporter, and its bounty program states that it only rewards the first person who reports a vulnerability, making the age debate a non-issue.

Could PayPal have done more? Sure. And it acknowledged that it could have. In its letter back to Kugler, it promised him an official letter of recognition from its CISO (a first for its program), thanked him, and even offered to give him a call for a chat. Anyone else who came second in a bug bounty could rightfully have been told they were too late.

A part of me did feel dejected that this was the best PayPal could muster, but when I consider what PayPal could have done, which was to tell the kid to go take a hike, I realised that it's yet another case of so many of us being far too harsh on a company that is trying.

The reality is, there are few companies that have their security at a level that would please the most demanding of us, and fewer still that have bug bounty programs. Instead of pointing out the flaws in those that are trying, though, we should be encouraging others to listen to more bug reporters.

We're not going to see others rise to the challenge of rewarding responsible disclosure if the only thing we do is point out the flaws in existing programs. And sad as it may be, a flawed program is better than none at all.
