Understanding, Underachievement, and the Impact of Conformity

"Work smarter, not harder" does not mean you can replace all your smart employees with tools and cheap button clickers. The fallacy lies in treating tools as a 1:1 replacement for people: you have the cost of the tool, the cost of the people who use the tool, and the cost of the people who install, maintain, and support the tool.

You can attract the right talent if you pay them what they are worth, treat them with respect, and enable them to support your organizational mission. Consider that as an alternative to silver-bullet solutions with complex back ends and major licensing fees.

I have worked within IT for nearly twenty years, and after sacrificing more of my personal life than was reasonable to long hours and countless weekends spent in support of organizational missions, I decided to stay in information technology and join the information security discipline. I did this because I wanted to make things better for the millions of people who daily entrust organizations with our financial and personal details, whether in naiveté (believing we cannot be harmed by the loss of the information we share) or in the belief that these organizations are using the data we give them prudently and with due care to protect it.

This isn't to say I am a card-carrying member of the caped crusaders, but if you spend as much time as I do working, don't you want it to matter?

I didn’t get better hours out of the deal. But, in this industry, we tell ourselves this is worthwhile because we are working with those businesses to provide them with practical advice, products, or services that ultimately support the goal of protecting this data. Our goal is to work within requirements that are reasonable for businesses to operate effectively, but also to provide meaningful advice based on our understanding of topics that most others consider boring or arcane. 

When we tap on the keyboard and get access that we shouldn't have, it's easy for people to assume that we are wizards or miscreants. But the reality is, the knowledge we have came from hours upon hours of intense, self-driven study. Yet the lack of understanding about how we do what we do not only leads organizations to take more risks than are strictly necessary in the operation of their businesses, but also causes people to believe that what we do is the same as a scanner or some other automated tool. Of course, almost any vendor will have you believe this, because programs are always cheaper than people. Or are they?

It is only recently that the concept of a "good" hacker could be leveraged successfully by legitimate businesses in a role that supports defense. And as our role in this microcosm evolves, so will the skills needed to be successful at it. The demands and use of this role will change, too; breaking computers is always going to be easier than fixing people. Up to this point, the role of a penetration test has been to test security controls to see whether they are adequate for the role or configuration in which they have been deployed; we are the stones who sharpen the sword. Our role has been to demonstrate attack chains so that organizations can put personalized context onto the intelligence reports about compromises that paper the news outlets and conference buzz feeds. Our role has been, to put it in slang terms, to provide pics, or it didn't happen.

But, more importantly, we are the experts who have the knowledge that unifies the narrowed mindsets created by hyper-specialization within IT shops worldwide. Where database administrators and systems administrators and network administrators may each see only their part of organizational architecture, the expertise of security professionals should provide the glue that enables an organization to examine architectural issues from a holistic view. Our real value to defense is to provide tactical advice driven by human creativity and sharply honed intellect about how to make systems better than they are.

Of course, not everyone wants to be better. There are many who want only to be good enough to avoid unwanted attention or public scorn. There are plenty for whom the arbitrary bar of compliance will always be more than sufficient, because the only forces holding them accountable for that trust are the activity of criminals and the response of trustees who shackle themselves to less-than-adequate systems with chains of convenience. Some feel they can't afford to do anything better, and if they prioritize tools over people, they're probably right.
There are certainly those who overreact, too: people who, knowing not from experience but only from anecdote, will not be stilled until every cent is spent on complex, unnecessary solutions or processes that drive any hope of conducting commerce into the ground. For these people, there will be endless conspiracies until every system is unplugged and sealed away.

And perhaps we have failed in the ultimate goal of making a meaningful bridge for defenders. Maybe we have focused too much on knocking the robot down the stairs and saying, "See?" But, the message isn't "anyone can knock this robot down." The message is "the robot can be better." We've made leaps and bounds in making better security available for everyone. The knowledge is out there. Many vendors are listening, even cooperating, when before there was nothing. We shouldn't overlook that.

But still, in pursuit of cost savings, organizations strive to replace skilled people with automation and rigidly defined checklists. And as they do, the utility of such a knowledge set is diluted. The less people need to understand in order to accomplish the high-level objectives of a job, the less useful they become in these contexts. Running a tool does not make you qualified to understand a new system or the broader impacts of systems interactions.
This mindset even discourages the type of learning that enables people to become successful contributors in this part of the information security space. Worse, rigid process and blunt automation do not make us better. Automation does not drive innovation; it reinforces the status quo. Automation does not examine a system to identify the ways that system can improve; it examines the system according to a template of known action and identifies the ways that system fails to conform. Not only does this cookie-cutter ideal of operation fail to improve security, it also fails to analyze security with the same unscrupulous eye as an attacker, whose success is driven by exactly the opposite of this ideal: being unexpected.