As businesses suffer large cyber breaches with increasing regularity, should we be gearing up for an ultimate breach? Is it really possible for someone to get inside the underlying infrastructure of major cloud providers such as Amazon AWS or Microsoft Azure and start exfiltrating data from their storage arrays?
I believe these are serious questions we should be asking ourselves. We place a huge amount of trust in these third-party vendors. They clearly do a great deal to protect our security, but in the current climate no company, no matter how big or powerful, is impervious to a large-scale cyber breach.
Things to remember about The Cloud
- Who owns the keys to the kingdom, including API access keys, SSH private keys and KMS keys, and to which resources? Where can they connect from internally in the office, and from where can they connect remotely? Do you regularly use contractors who come and go in development teams, and is someone closely managing their access control? If you are unsure of your organisation’s position on any of these questions, it’s crucial to find out the answer. We live in a world where projects run under strict deadlines, contractors come and go on a weekly or monthly basis, and it’s getting increasingly difficult to keep track of everything. It’s vital to restrict access not only to passwords and API keys but also to the resources and infrastructure our developers and sysadmins access them from, adopting the principle of least privilege. Strict processes need to be in place to stay on top of, and limit, who has access to our cloud infrastructure.
- Now that the top cloud providers supply up-to-date virtual machines, it’s easy to forget some of the security practices we traditionally applied when working with bare metal. From a security perspective, people sometimes assume that because data sits on a top-tier provider’s cloud infrastructure it will be secure. Thought still needs to be put into areas like network segregation, host-based monitoring, security information and event management (SIEM) platforms, and PowerShell logging to capture in-memory malware that never touches the disk.
- What resources have we spun up, and how many VMs are currently running? What are we storing in our massive storage arrays? In large cloud environments, a company can see VMs spun up daily as test platforms and development environments. Public-facing servers, easy prey for an attacker, can be left open with poor or default configuration. It might only be a development environment, but in the rush it’s easy to fumble firewall rules and network segregation, possibly allowing an attacker to pivot from the development network into the production network.
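The key-management point above can be made concrete with a small automated check. The sketch below is illustrative only: the 90-day rotation threshold is an assumed policy choice, and the dictionary field names simply mirror the shape of typical IAM key metadata rather than calling any provider's API directly.

```python
from datetime import datetime, timedelta, timezone

# Maximum age before an access key is flagged for rotation.
# 90 days is an assumed policy threshold, not a provider default.
MAX_KEY_AGE = timedelta(days=90)

def stale_keys(key_metadata, now=None):
    """Return the IDs of access keys older than MAX_KEY_AGE.

    `key_metadata` is a list of dicts with 'AccessKeyId' and
    'CreateDate' fields, a simplified stand-in for the metadata
    you would fetch from your cloud provider's IAM service.
    """
    now = now or datetime.now(timezone.utc)
    return [k["AccessKeyId"] for k in key_metadata
            if now - k["CreateDate"] > MAX_KEY_AGE]

# Example: one key from January is stale by June; a recent one is not.
keys = [
    {"AccessKeyId": "AKIAOLD", "CreateDate": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"AccessKeyId": "AKIANEW", "CreateDate": datetime(2024, 5, 20, tzinfo=timezone.utc)},
]
print(stale_keys(keys, now=datetime(2024, 6, 1, tzinfo=timezone.utc)))  # ['AKIAOLD']
```

Run regularly (for example from a scheduled job), a check like this surfaces forgotten contractor credentials before an attacker does.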
Working in cyber security as a penetration tester opens your eyes to events that previously seemed impossible. It is possible for people to infiltrate banks, national infrastructure and large-scale businesses, including blue-chip tech companies and defence contractors. Security problems are everywhere these days, often due to insufficient resources or projects being pushed to strict deadlines with security treated as a low priority. The larger the company, the larger the attack surface, which makes it easier for penetration testers to get in and find problems, but in turn makes it easier for malicious threat actors to gain access, compromise companies and exfiltrate data.
We need to keep security at the front of our minds on every IT project, no matter how big or small, because the more data is consolidated and moved to cloud platforms, the greater the chance of a major breach. It is going to happen, so make sure your company isn’t the next one on the front page for leaking millions of records and its intellectual property.
How Nettitude can help
Nettitude can help your business stay safe with specialised cloud service testing. Contact us today for a free consultation.