Broaching the security and privacy implications of the data age
The next several days of Black Hat USA, DEF CON, BSides, and other great events kick off the 2017 edition of what's been lovingly known for years as "Hacker Summer Camp." As a marketer, I'm the first to admit that the event name used to make me feel like a bit of an outcast. "Hey, I belong there, too!" (If you can imagine that in the smallest inside voice possible.) And, as a marketer who firmly believes in the mission of arming the defender, I absolutely do belong.
Heading into Hacker Summer Camp, I've seen a significant amount of public relations outreach. In my unique position as a ZDNet contributor and a chief marketing officer, I see both sides of this. We give the pitches, we get the pitches. The piece that has stood out to me is how flippantly, at times, marketers will treat research in their promotions. They almost seem to regard a race to be first as more important than revering the research. I'm the first person to want the big story -- the "air cover" as we call it -- and the brand development. However, that should never come at the cost of the research, or the researcher. Here's why:
- Research is at the core of our defense.
- Research is nontrivial and requires the most unique expertise.
- Research requires sacrifice.
- Research requires risk.
You may be thinking, "Oh, silly marketer, what do you know?" I know. I've been fortunate to work with high-caliber teams such as Fortinet FortiGuard, Sourcefire VRT, Cisco Talos, and the amazing talent at my current employer. I see these women and men work long hours to do the laborious job of helping their companies and clients build better defenses. It's not just about long hours, it's about a very unique balance of skills (marketers, stick with me):
Operational Security (OPSEC)
In order to break things or dig into the depths of the criminal underworld, these researchers and analysts need to take extreme precautionary measures to ensure that their identities, their activities, and the companies on whose behalf they work are protected. This doesn't make the work easier for them; it makes it harder. There's a precision, a careful calculation, and often systems that need to be in place to help them achieve the required level of OPSEC.
Just as journalists need to, and should, protect their sources, researchers need to protect theirs as well. It's trickier, however, than simply not naming someone, or withholding the routes that led to the intelligence or reverse engineering that defenders needed. It's protecting good sources so they don't become targets of malicious activity because of what they unveiled -- that's the part similar to journalism.
But it's also about protecting sources so that, if they have found a hack or a path to obtaining information that helps the greater good, they don't lose it, or worse, compromise OPSEC for anyone involved. At some point, of course, if research is publicized, a light is shone upon the one who did the work, but that's where Responsible Disclosure and other assurances come in to protect certain sources.
Remember the term guilt by association, or -- heaven forbid -- death by association? (We've all seen Goodfellas, right?) What happens to the innocents who stand by the good folks, even those who stand on the right side of the law? They become vulnerable.
Any time a researcher or analyst takes on a cyber-criminal, they are potentially putting their own well-being, and that of their close family and friends, at risk. If you need proof, look at what has happened to high-profile defenders and/or their families in terms of doxxing or swatting when the defender's research has negatively impacted criminal gains. That risk also increases with nation-state responses when valuable research exposes a state's cyber activities.
I wish I could do what security researchers can do; ask any one of my best friends, who can school me technologically on their worst days. It's why I found a way, years ago, to apply my own unique and weird skillset to helping build companies that create jobs and opportunities for that talent: the researchers and analysts. Their skills are not easy to come by. As in any discipline, completing the studies or taking a hacking course doesn't mean you can apply it in the real world. That's why red teaming and adversarial engineering are so critical in determining where our best defenses originate.
Collaboration with law enforcement, the constant change of the criminal underground and potential threat vectors, the current geopolitical landscape, and even some of the limited sharing and collaboration mechanisms create additional challenges for researchers.
OK, marketers, this is why you should care. Everyone knows us as the snake oil salespeople who spin straw into gold in order to build a castle (we also mix metaphors). We are allegedly in it for the money, the glory, the news clips, the attention, and the internal praise of our executives.
That's wrong, and if it's not wrong, you are doing it wrong. I don't know that this is unique to security, but I'm biased after 17 years; you cannot effectively market what you don't understand. That means you must understand how research works, how the community operates, how the subversive culture applies to defensive wins, and how the industry is truly built. If you're doing it for the wrong reasons -- aka, stereotypical marketing goals -- then you're not in the right industry.
I see this evidenced when, as I mentioned earlier, these atrocities happen:
- Accuracy is compromised in order to get a story out faster to "beat" the competition.
- Lightweight marketing content is pushed out as "research" and only hits on topical levels.
- FUD is spread. Do not scare people into writing, downloading, reading, or taking a meeting. A scary story helps no one. Related: Don't even think about pushing against Responsible Disclosure for the benefit of marketing. It will backfire -- as it should.
- Embargo rules are not honored. Think about it, marketers. Knowing what you now know, imagine you're a research team that has done all of the above to help defenders, and you send an unsolicited embargoed report to a journalist you don't know. You've just cheapened the research, risked its confidentiality, and worse, you've forgotten that this research exists to protect people; marketing is just a nice secondary benefit.
- Researchers are not cited. Make sure their name is on their findings. Unless they don't want to be public due to their own OPSEC and risk concerns, they should always be credited with the hard work they have done and continue to do. It's not just about researcher ego (though, let's admit it, it exists). It's about showing appreciation for their hard work, revering the researcher, and realizing that as marketers in some way we are responsible for these people's brands and the level of awareness they get. Do not diminish that. Especially for the sake of a campaign or story.
Again, I am a chief marketing officer, as well as a ZDNet contributor. I know as well as anyone that demand generation, conversion metrics, lead qualification, and air cover count. But you build a business not by being first, not by being loud, not by spreading FUD or selling out researchers to get a faster headline. You do it by showing your company's, or your clients', expertise in a way that best represents them, with marketing content reviewed by them, with expert input from start to finish. That's what builds a strong security company.
That's how you revere the research.
See you in Vegas.