Charting Stacks

Security, Accountability and Compliance as Code


Over the last few days we have seen a flurry of coverage here in the UK around a hack (or, more accurately, an SQL injection, which appears to have been carried out by a fifteen-year-old) on a large ISP, TalkTalk. Now, this is not the first hack to have hit TalkTalk this year, but it rapidly became the most high profile due to the possibility of personal and financial data being taken.

While we have had some wonderfully scaremongering headlines, which have subsequently morphed into something slightly more sensible, we have also had a number of business leaders and organisations immediately commenting on what should be done.

Among the first organisations out of the gate to call for more government intervention was the Institute of Directors, reaching for the catch-all phrase of “cyber-crime”.

Outsourcing and Accountability

The comments from the Institute of Directors lead us pretty quickly to the issue of technical literacy at board level. As James has recently pointed out, the era of big outsourcing has come to a close. However, we have a legacy of technology being treated at board level as a cost rather than a strategic investment. This, of course, does not match any of the trends we are seeing across multiple industries that are being disrupted by new technologies and the innovative ways developers are bringing these technologies together.

I recently gave a talk on emerging technologies to a group of CIOs, all of whom run large technology estates. One of the questions I asked was how they would rate their own board's estimate of its level of technological understanding, and what the reality was. Without going into the precise details, suffice it to say that on a scale of 1 to 10 we barely broke out of the lowest quartile on the reality question.

One way to ensure businesses take security risks, and technology risks in general, seriously is to ensure accountability rests at the correct point in an organisation. Nothing ensures a focus on doing the right thing more than making the board responsible. Perhaps the Institute of Directors should lobby for boards to be responsible for signing off, say on a quarterly basis, on a compliance report confirming that a minimum standard of security is in place?

Now, while I am loath to suggest something that may lead to yet another ITIL-type standard (there are few things that generate as many groans among a technologically literate audience as mentioning ITIL), there is a need for consumers and businesses to have a level of trust and confidence in the companies they do business with. Incidents such as the TalkTalk hack erode this.

Security

Security is a huge subject and, to be honest, one I really do prefer to avoid; there are many far better informed commentators than I. However, as a consumer, there are some basic bits of security hygiene that we should all expect, and as an engineer I find it unacceptable when these minimal, easy-to-implement standards are not in place.

From not storing passwords in plain text and safely storing personal and financial details, to keeping operating systems and applications patched to appropriate levels, to running penetration tests as part of your CI system, these are all simple, low-cost, high-gain areas for any company to address.

Will implementing these basics eliminate attacks? Not at all, but they will raise the barriers substantially.
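
To make the first of those points concrete, here is a minimal sketch of salted password hashing using only Python's standard library. The iteration count and the surrounding usage are illustrative assumptions rather than recommendations for any particular system:

import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    # Derive a key from the password with a random salt; store (salt, key), never the password.
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 200_000)
    return salt, key

def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    # Re-derive the key and compare in constant time to avoid timing leaks.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 200_000)
    return hmac.compare_digest(candidate, stored_key)

# Store salt and key alongside the user record; the plain-text password never touches disk.
salt, key = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, key)
assert not verify_password("wrong guess", salt, key)

The point is less the specific key-derivation function than the fact that only the salt and derived key are ever stored.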

Compliance as Code

The idea of compliance as code seems obvious to anyone familiar with DevOps, and it is an area that is gaining a lot of focus in recent times, with luminaries such as Gene Kim giving some excellent talks on the how and why of compliance as code.

The reality is that compliance is not going to get any easier in the future. We have multiple pieces of legislation working their way through legislative bodies across the globe, and as consumer awareness of privacy and security continues to grow, we can expect more and more requirements to emerge.

The good news is that multiple vendors are working on this problem and creating tools that are either already familiar or fit easily into a DevOps workflow, from configuration management tools such as Chef, Puppet and Ansible, to tools like AirWatch from VMware securing devices at the edge.

Now that the tools are available, and being actively developed, the question we come back to is whether boards want to invest in their staff. The only way to implement an approach such as compliance as code is to automate it. At scale. To do this, companies need to invest in their teams and ensure the skills are available in-house.
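
To give a flavour of what that automation can look like, the sketch below expresses two baseline controls as plain Python tests that could run on every build. The host, file path and policy thresholds are illustrative assumptions, not a reference to any particular vendor's tooling or regulatory standard:

# compliance_checks.py -- a sketch, not a standard: hosts, paths and thresholds
# would come from your own policy.
import datetime
import socket
import ssl

def test_certificate_not_about_to_expire(host: str = "www.example.com", port: int = 443,
                                         min_days_left: int = 14) -> None:
    # Fail the build if the public TLS certificate is inside the renewal window.
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.datetime.utcfromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]))
    days_left = (expires - datetime.datetime.utcnow()).days
    assert days_left > min_days_left, f"certificate for {host} expires in {days_left} days"

def test_sshd_disables_password_auth(config_path: str = "/etc/ssh/sshd_config") -> None:
    # Fail if this host still allows password logins over SSH.
    with open(config_path) as fh:
        directives = [line.split("#")[0].strip().lower() for line in fh]
    assert "passwordauthentication no" in directives, "sshd still permits password logins"

Run under an ordinary test runner in CI, a failing control blocks the release in the same way a failing unit test would, which is the essence of treating compliance as code.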

More broadly, however, there is an onus on executives at board level to understand the changes necessary at an organisational level to get the best out of technology. Blindly outsourcing everything and accumulating technical debt was always a bad idea, and continuing to do so will only compound the problems.

Disclosure: Chef, VMware, Ansible and Red Hat (which is acquiring Ansible) are all current RedMonk clients.
