What IT Pros Need to Know About Multi-Cloud Security
Brought to you by ITPro
Having workloads distributed across multiple clouds and on-premises is the reality for most enterprise IT today. According to research by Enterprise Strategy Group, 75 percent of current public cloud infrastructure customers use multiple cloud service providers. A multi-cloud approach has a range of benefits, but it also presents significant challenges when it comes to security.
Security in a multi-cloud world looks very different from the days of securing virtual machines, HashiCorp co-founder and co-CTO Armon Dadgar said in an interview with ITPro.
“Our view of security is it needs a new approach from what we’re used to,” he said. “Traditionally, if we go back to the VM world, the approach was sort of what we call a castle and moat. You have your four walls of your data center, there’s a single ingress or egress point, and that’s where we’re going to stack all of our security middleware.”
At this point it was assumed that the internal network was a high-trust environment, and that inside of those four walls, everything was safe. “The problem with that assumption is we got sort of sloppy,” Dadgar said, storing customer data in plaintext and having “database credentials just strewn about everywhere.”
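A first step toward that hygiene is simply keeping credentials out of source code. The sketch below is generic and hypothetical (the `DB_PASSWORD` variable name is illustrative, not a HashiCorp convention): it reads a database password from the environment at runtime and refuses to fall back to a hardcoded default.

```python
import os

def get_db_password() -> str:
    """Fetch the database password from the environment at runtime.

    Keeping the credential out of source code and config files is the
    first step toward the hygiene Dadgar describes; a dedicated secret
    store such as Vault goes further by adding auditing and rotation.
    """
    password = os.environ.get("DB_PASSWORD")  # hypothetical variable name
    if password is None:
        raise RuntimeError(
            "DB_PASSWORD is not set; refusing to fall back to a "
            "hardcoded default"
        )
    return password
```

Failing loudly when the secret is absent matters: a silent empty-string default is exactly the kind of "credentials strewn about" sloppiness the old high-trust network let teams get away with.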
Of course, IT pros can no longer assume that this is the case, and must take a different approach, particularly in a multi-cloud environment.
Cloud connectors, APIs create more entry points for hackers
“Now many of these organizations don’t have one data center. They don’t even have one cloud,” he said. “They may be spanning multiple clouds and within each cloud they have multiple regions, and all of these things are connected through a complex series of VPN tunnels or direct connects where the data centers are connected together on fiber lines, those things are probably tied back to their corporate HQ and the VPN back there. It’s truly a complex network topology where traffic can sort of come from anywhere.”
Dadgar is one of the founders of HashiCorp, which launched in 2012 with the goal of revolutionizing data center management. Its tools – which the company has open sourced – manage physical and virtual machines, Windows and Linux, and SaaS and IaaS, according to its website. One of these tools, called Vault, “secures, stores, and tightly controls access to tokens, passwords, certificates, API keys, and other secrets in modern computing.”
Dadgar sees Vault as one of the newer tools that security pros are looking to in place of middleware, but it’s not just technology that needs to change in a multi-cloud environment.
“Security practitioners are trying to figure out how to bring security to [a] Wild West situation,” Dadgar said, noting that the security professional’s approach has changed: they now need to work closely with developers and operators.
“Security people are being pulled more intimately into [the] application delivery process as well as having to totally recast the set of tools they use, and take more of a service provider approach as opposed to a sort of invisible hand,” he said. “Security has to have a seat at the table, developers and operators have to be aware of it, and there’s a necessary tooling change.”
These changes include ensuring that data is encrypted both at rest and in transit, and taking a hygienic approach to secret management, he said.
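For the "in transit" half of that advice, a small illustration of what enforcement can look like in practice: the Python sketch below (a generic example, not a HashiCorp recommendation) builds a client-side TLS context that explicitly pins certificate verification and a minimum protocol version, so a later configuration tweak cannot silently weaken the settings.

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that refuses unverified connections."""
    ctx = ssl.create_default_context()
    # create_default_context() already enables certificate checking;
    # setting these explicitly makes the intent visible in code review
    # and guards against a "slight change" loosening them later.
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

Encryption at rest would typically lean on a managed key service or a secret store’s encryption features rather than hand-rolled code.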
“One of the things that kind of protected us in the old world was it was a lot more obvious when you were making a mistake when you physically had to rack and stack servers and move cables around,” Dadgar said. “Now that we’re in the cloud world and everything is an API, it’s not so obvious what’s happening. If I make a slight change to configuration it’s not necessarily obvious that this is bypassing my firewall.”
One example of this is the recent OneLogin security breach, in which customer data was compromised and hackers were able to decrypt encrypted data. OneLogin, a San Francisco-based provider of single sign-on and identity management for cloud-based applications, said “a threat actor obtained access to a set of AWS keys and used them to access the AWS API from an intermediate host with another, smaller service provider in the US.”
In a post-mortem on its blog, OneLogin said, “Through the AWS API, the actor created several instances in our infrastructure to do reconnaissance. OneLogin staff was alerted of unusual database activity around 9 am PST and within minutes shut down the affected instance as well as the AWS keys that were used to create it.”
Security common sense, policies still have place in multi-cloud
David Gildea has worked in IT for 17 years and is the founder and CEO of CloudRanger, a SaaS-based server management provider based in Ireland. He said that enterprises often don’t realize they must take the same precautions with cloud vendors as they do with their data center and on-premises IT. Part of this is ensuring that the vendors they work with have just enough access to get their job done, he said.
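On AWS, for instance, that least-privilege idea might look like the following minimal IAM policy, which grants a vendor read-only access to a single S3 bucket. This is an illustrative sketch; the bucket name and statement ID are hypothetical, not taken from CloudRanger or any vendor.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VendorReadOnly",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-vendor-bucket",
        "arn:aws:s3:::example-vendor-bucket/*"
      ]
    }
  ]
}
```

If the vendor’s credentials are compromised, the blast radius is one bucket, read-only, rather than the whole account.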
“If you have access to this one tool and it gets compromised then it’s a huge problem for enterprises,” Gildea said.
Part of the problem that he sees is that enterprises don’t have the right security policies in place when they enter the cloud, and then the problem is perpetuated as more workloads are spun up across clouds.
“What happens over and over again is you get this proof of concept that turns into a production application and there are no standards or policies set at the very beginning so things start off on a bad footing and that spreads to other clouds,” he said.
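One way to keep a proof of concept from drifting into production without standards is to make the standards checkable from day one. The sketch below is hypothetical (the required tag set is an example policy, not Gildea’s): it verifies that a cloud resource carries the tags a policy demands before it is deployed.

```python
# Hypothetical example policy: every resource must carry these tags.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}

def missing_tags(resource_tags):
    """Return the required tag keys that a resource is missing.

    An empty result means the resource complies with the tagging policy;
    a CI step can refuse to deploy anything that returns a non-empty set.
    """
    return REQUIRED_TAGS - set(resource_tags)
```

Checks like this cost little at the proof-of-concept stage and prevent the "bad footing" from spreading as workloads multiply across clouds.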
Along with the lack of security policies there is also a lack of testing, Gildea said.
“What we see [is that] things just aren’t tested; business continuity, for example. Everyone has backups in some way, shape or form, but what happens is it’s not tested. There is a great assumption there that the cloud is going to do everything for you.”
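Testing a backup ultimately means restoring it and comparing the result to the original. The sketch below is a stand-in for that drill using plain file copies and checksums; in a real environment the "backup" and "restore" steps would call the actual backup system, but the verification logic is the same.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file so original and restored copies can be compared."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(source: Path, backup_dir: Path) -> bool:
    """Back up a file, restore it, and confirm the bytes match.

    A stand-in for the real drill: periodically restoring from the
    actual backup system and checksumming the result, rather than
    assuming the cloud provider has it covered.
    """
    backup = backup_dir / source.name
    shutil.copy2(source, backup)                       # the "backup" step
    restored = backup_dir / f"restored-{source.name}"
    shutil.copy2(backup, restored)                     # the "restore" step
    return sha256(source) == sha256(restored)
```

The point is not the file copy but the habit: a backup that has never been restored is an assumption, not a continuity plan.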