More than 22,000 container orchestration and API management systems are unprotected or publicly accessible on the internet, highlighting the risks of operating workloads in the cloud.
According to research from Lacework, these container orchestration platforms (Kubernetes, Mesos, Docker Swarm and more) suffer from poorly configured resources, a lack of credential requirements and the use of non-secure protocols. As a result, attackers can remotely access the infrastructure to install, remove or encrypt any application that the company is running in the cloud.
In all, Lacework found 22,672 open admin dashboards on the web earlier this month; more than 300 of them were unprotected by any credentials whatsoever. About 95 percent of these are hosted on Amazon Web Services (AWS). The firm said that it has alerted the affected companies.
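Lacework's scan was largely automated. One plausible way such a scan might bucket each probed admin endpoint is by the HTTP status it returns to an unauthenticated request; the sketch below is an illustrative assumption, not Lacework's actual methodology, and the category names are invented for clarity.

```python
# Hypothetical classification of an admin endpoint's exposure based on the
# HTTP status code returned to a request carrying no credentials.
# The categories are illustrative, not Lacework's actual taxonomy.

def classify_exposure(status_code: int) -> str:
    """Map an unauthenticated HTTP response to a rough exposure category."""
    if status_code == 200:
        return "open"            # dashboard served with no credentials at all
    if status_code in (401, 403):
        return "credentialed"    # reachable, but some auth layer is in place
    if status_code in (301, 302, 307, 308):
        return "redirected"      # often a bounce to a login page
    return "unclear"             # timeouts, errors, unusual responses

print(classify_exposure(200))   # → open
print(classify_exposure(401))   # → credentialed
```

Under this scheme, the 300-plus dashboards with no credentials whatsoever would be the ones answering an anonymous request with a 200 and a full console.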
“Each of these openings presents an entry-point into potentially sensitive data or an opportunity to hijack an organization’s cloud assets,” Dan Hubbard, chief security architect at Lacework, said in an interview with Threatpost. “Depending on the motives of the attackers, they can perform a variety of attacks. When a user interface is open, anyone can get inside. Once inside, they can quietly use resources to mine for cryptocurrency, deposit malware or ransomware, wipe out data, grab sensitive PII, or do any of a number of things that could frustrate normal cloud operations and/or expose critical, private data.”
If container management consoles and APIs are open and accessible, it takes only a rudimentary level of skill to do damage. A more sophisticated attacker, however, can perform data-mining on the open infrastructure to learn more about potential targets before deploying an attack.
“More experienced hackers will be able to navigate to critical data in the container and beyond and into the overall cloud infrastructure,” Hubbard said. “These experienced hackers know that a combination of behavioral engineering (getting anecdotal information about the way an environment is structured) along with access is the most potent combination, so they will try to get information from employees through phishing emails, bogus phone calls, and other means that prey on unsuspecting informants.”
Even if the containers have authentication turned on, an attacker can attempt simple brute-force and dictionary password attacks to gain access, Hubbard noted.
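Why dictionary attacks remain effective against consoles protected only by a weak or default password can be sketched in a few lines. The `check_login` function below is a stand-in simulation of a dashboard's authentication endpoint, and the wordlist and password are invented for illustration; a real attack would hammer a network login form instead.

```python
# Illustrative sketch of a dictionary attack against a weakly protected
# management console. check_login simulates the console's auth check;
# in reality an attacker would submit each guess to the login endpoint.

COMMON_PASSWORDS = ["admin", "password", "123456", "kubernetes", "changeme"]

def check_login(password: str, real_password: str = "changeme") -> bool:
    """Simulated authentication check for a management console."""
    return password == real_password

def dictionary_attack(wordlist):
    """Try each candidate password in order until one succeeds."""
    for candidate in wordlist:
        if check_login(candidate):
            return candidate
    return None

print(dictionary_attack(COMMON_PASSWORDS))  # → changeme
```

A console secured with a default or dictionary-word password falls in a handful of guesses; rate limiting, lockouts and strong credentials are what make this approach impractical.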
The affected companies run the gamut in terms of size and industry.
“Although we did not access any of the consoles to dive into what the targets were, or dig in to a level that would allow us to see if they were compromised (this was largely automated), you can see in the data that there are all kinds of organizations included,” Hubbard said. “Although we are not sharing company names, this is something that is not just about small startups or research divisions.”
Containers are increasing in popularity among DevOps users in companies of all sizes because they facilitate collaboration, which optimizes their ability to deliver code fast to virtual environments. However, Lacework noted that securing workloads in public clouds requires a different approach than the one used for traditional data centers, because in the cloud, APIs drive the infrastructure and create short-lived workloads. In turn, containers are also becoming more interesting to cybercriminals.
“There is an inherent dynamic at play if you combine the goals of DevOps with easy-to-use tools like containers,” Hubbard noted. “Users are not abiding by security controls, and without continuous and automated monitoring of the environment, things like misconfigurations and open containers linger as undetected issues. The DevOps team is not typically organized to look for these security issues, so they keep pushing code and using containers because containers make it easier for them.”
However, the cloud operates with different types of perimeters, and movement of data is not linear, which poses new security challenges. Rampant misconfiguration of cloud storage buckets like Amazon S3 has plagued the space for years – containers are just another wrinkle.
“So much data is going into and out of S3 buckets and containers, but because it’s often happening in an ad-hoc way through connections made by APIs, no one really knows the specifics of what data is available and what its status is,” Hubbard said. “The same things happen at the user level; the cloud encourages broad permissions so people can make changes and use their technology investment at business speed. But overly-permissive models lead sometimes to these assets being almost globally available.”
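The overly-permissive pattern Hubbard describes is easy to spot mechanically in an S3-style bucket policy: any `Allow` statement whose `Principal` is the wildcard `"*"` applies to everyone on the internet. The sketch below scans a policy document for such grants; the sample policy, bucket name and account ID are invented for illustration, and a real audit would also need to account for bucket ACLs and account-level public-access blocks.

```python
# Sketch: flag statements in an S3-style bucket policy that grant access
# to any principal (the "almost globally available" case). The sample
# policy below is fabricated for illustration.
import json

SAMPLE_POLICY = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Principal": "*",             # world-readable
         "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::example-bucket/*"},
        {"Effect": "Allow",                                # scoped to one account
         "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
         "Action": "s3:PutObject",
         "Resource": "arn:aws:s3:::example-bucket/*"},
    ],
})

def public_statements(policy_json: str):
    """Return the Allow statements that apply to any principal."""
    policy = json.loads(policy_json)
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_wildcard = principal == "*" or (
            isinstance(principal, dict) and "*" in principal.values())
        if is_wildcard and stmt.get("Effect") == "Allow":
            flagged.append(stmt)
    return flagged

print(len(public_statements(SAMPLE_POLICY)))  # → 1 world-readable grant
```

Continuous, automated checks of this kind are exactly the monitoring Hubbard argues is missing when misconfigurations "linger as undetected issues."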
The findings should come as a wake-up call, Hubbard said.
“This is strong evidence of where we are in the security maturity levels when it comes to deploying into public cloud,” he told us. “While certainly there are companies deploying extremely secure workloads running least privilege, there is also a lot happening with few guardrails, little visibility, and little auditing.”