I would propose a working thesis: no security breach happens without a change, or a need for a change. Do you unequivocally know what has changed in your IT footprint? Start by thinking about your holistic IT environment, from the physical layer all the way up to your applications and, most importantly, your data. Now ask yourself: how confident are you that all changes are known, monitored, and detected? If you resemble the research findings around these kinds of controls, you probably aren't confident, since approximately 87% of organizations don't have a "closed loop" process with detective controls to manage change. The overarching concept is this: if you don't have integrity in your systems and their configurations, you don't have a secure environment, and if you don't have a secure environment, you cannot have trusted computing.
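To make "detective controls" concrete: at its simplest, detecting unknown change means recording a known-good fingerprint of each monitored configuration file and flagging any drift from that baseline. The sketch below is a minimal, hypothetical illustration in Python; the function names and baseline format are my own assumptions, not a prescription for any particular tool.

```python
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def build_baseline(paths):
    """Record a known-good hash for each monitored file."""
    return {str(p): fingerprint(p) for p in paths}


def detect_drift(baseline):
    """Compare current state against the baseline and report changes."""
    drift = []
    for name, expected in baseline.items():
        p = Path(name)
        if not p.exists():
            drift.append((name, "missing"))
        elif fingerprint(p) != expected:
            drift.append((name, "modified"))
    return drift
```

Run on a schedule, any non-empty result from `detect_drift` is a change that should reconcile to an approved change record; if it doesn't, integrity is in question.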
What is too often forgotten when considering a holistic approach to cybersecurity is that processes and controls matter. Too often, best practices in IT management are dismissed as bureaucratic inefficiencies that block the ever-increasing demands on IT to produce results faster, solve business problems more quickly, and release software and updates immediately. This pressure helped give rise to "DevOps," the idea of marrying development to operations in order to drive efficiency and release things faster. While I am certainly not proposing that this is impossible, I do see failed changes causing a myriad of issues, one of them being a threat to the integrity of a particular IT footprint, be it software, hardware, or otherwise. Managing change is critical to knowing the integrity of your IT systems; one cannot effectively manage change without considering how things are released into IT environments, and one cannot do effective change and release management without knowing the exact configuration of each IT asset. To achieve ongoing integrity within your IT systems, you must follow configuration, change, and release processes. Without integrity you cannot have true security, and without security you cannot achieve trusted computing. So the trite answer to the rhetorical question in this article's title is simple: "YES."
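The "closed loop" that ties change, release, and configuration management together boils down to one reconciliation step: every change your detective controls observe should map to an approved change record, and anything left over is an integrity failure to investigate. A minimal sketch of that reconciliation, with illustrative (assumed) record fields:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ChangeRecord:
    asset: str   # e.g. "web-01:/etc/nginx/nginx.conf" (illustrative)
    ticket: str  # approved change ticket ID (illustrative)


def reconcile(detected, approved):
    """Close the loop between detected and approved changes.

    detected: set of asset identifiers observed to have changed.
    approved: iterable of ChangeRecord covering the same window.

    Returns (authorized, unauthorized): authorized maps each asset to
    its approving ticket; unauthorized lists assets that changed with
    no approval on record -- the changes that threaten integrity.
    """
    by_asset = {r.asset: r.ticket for r in approved}
    authorized = {a: by_asset[a] for a in detected if a in by_asset}
    unauthorized = sorted(a for a in detected if a not in by_asset)
    return authorized, unauthorized
```

The design point is that neither half works alone: detection without approval records just produces noise, and approval records without detection leave you blind to everything that bypassed the process.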