How Visibility and Automation Could Prevent Amazon S3 Leaks

As Amazon releases new S3 security tools this week, experts say new approaches that embrace automated remediation could solve S3 security holes

Nicole Henderson, Contributor

November 9, 2017

It has been over 11 years since Amazon Web Services introduced Amazon S3, its first service, to the market. Since then the cloud storage service has grown to become one of the most popular of the more than 70 AWS services. Enterprise customers including Netflix and Airbnb rely on S3 to back up and store petabytes of data.

As with other AWS services, S3 has gone through numerous iterations over the years, adding features and capabilities to meet evolving customer demands for data storage. But recent research has shed light on a troubling trend of admins leaving Amazon S3 buckets exposed to the public through misconfiguration.

In recent months, companies including Accenture, Verizon and election data firm Deep Root Analytics have all unwittingly exposed not only customer data, but also internal data such as cloud platform credentials, because of these misconfigured permissions.

This week AWS made security enhancements to help prevent S3-stored data from being exposed to people who were never meant to access it. The features aim to give customers more visibility into which S3 buckets are publicly accessible, and to provide default encryption for objects stored in a bucket.
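For teams that manage buckets programmatically, default encryption can also be switched on through the S3 API. The sketch below is a minimal illustration, assuming the AWS SDK for Python (boto3), a placeholder bucket name, and SSE-S3 (AES256) rather than a KMS key; it enables the bucket default and reads the setting back.

```python
# Minimal sketch: enable default (SSE-S3) encryption on an S3 bucket with boto3.
# "my-example-bucket" is a placeholder; credentials and region come from the
# usual boto3 configuration (environment, ~/.aws/credentials, or an IAM role).
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="my-example-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)

# Read the configuration back to confirm the default took effect.
config = s3.get_bucket_encryption(Bucket="my-example-bucket")
print(config["ServerSideEncryptionConfiguration"]["Rules"])
```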

Complicated user interfaces, coupled with confusion around who is responsible for cloud security, can lead to exposed credentials and data.

One of the new features addresses this issue by displaying a prominent indicator next to each S3 bucket that is publicly accessible, while a new S3 Inventory report shows the encryption status of each object. The report itself can be encrypted.
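The console indicator answers a question a script can ask as well: does a bucket's ACL grant access to everyone? The sketch below, again assuming boto3 and read access to bucket ACLs, flags buckets whose ACLs grant access to the AllUsers or AuthenticatedUsers groups. It is an illustration rather than a complete audit; a bucket policy can also make data public, and this check does not inspect policies.

```python
# Minimal sketch: flag buckets whose ACLs grant access to "everyone".
# Assumes boto3 credentials with s3:ListAllMyBuckets and s3:GetBucketAcl.
import boto3

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    public_grants = [
        g for g in acl["Grants"]
        if g["Grantee"].get("URI") in PUBLIC_GROUPS
    ]
    if public_grants:
        perms = ", ".join(g["Permission"] for g in public_grants)
        print(f"PUBLIC  {name}: {perms}")
    else:
        print(f"private {name}")
```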

Experts say this kind of visibility around security is critical, especially as organizations spread workloads across dozens of AWS accounts, or are managing data across multiple cloud vendors.  

“An enterprise with 40 Amazon accounts may have thousands or tens of thousands of storage buckets,” Dome9 CEO Zohar Alon told ITPro. Based in Mountain View, Calif., Dome9 offers a SaaS platform that provides enterprise-level security for IaaS public clouds including AWS, Microsoft Azure, and Google Cloud Platform. The Dome9 Arc platform provides monitoring, remediation, and policy enforcement.

“As we started serving more and more enterprises, financial institutions, manufacturing, we have 10 Fortune 50 companies that use our technology today, what we saw is they have multiple accounts with Amazon, some of them have hundreds of accounts, and they are also looking at other cloud vendors like Azure and Google Cloud Platform,” Alon said. “Even if a vendor solves a problem itself, it doesn’t mean the solution goes all the way to the customer. Because of multi-cloud and the shared sense of responsibility, organizations are trying to figure out how, and by whom, those responsibilities are managed internally.”

The complicated user interface and the sprawling nature of enterprise cloud infrastructure have given rise to an ecosystem of companies like Dome9 that help customers simplify security, monitoring, and other aspects of running cloud infrastructure on AWS.

“Because Amazon is engineers thinking for engineers, they are minded to security but not minded to make it as accessible as we may think from the perspective of a security vendor,” Alon said. “Amazon’s mission is to make you use the infrastructure to its fullest extent, to grow as fast as possible, and sometimes this needs to be complemented by specific security expertise that can’t necessarily catch up as fast when they come from the vendor.”

In some cases, enterprises try to apply security strategies and technologies that would have worked in an on-premises environment to their cloud infrastructure. According to RedLock CEO Varun Badhwar, this can be the biggest challenge for security vendors trying to educate customers about newer, more effective security methods, many of which can be automated.

“Trying to fly blind in the cloud and assuming the cloud provider is responsible for the entire security stack is a false premise to begin with,” Badhwar said. “Ultimately organizations still need to solve all the problems they had to solve in traditional data center environments, whether it’s vulnerability management, user monitoring and security, access control, configuration and network management, or traffic visibility. However, in the cloud your existing data center security architectures become irrelevant; you can’t just plug in a Cisco or a Palo Alto Networks next-gen firewall and expect that to protect you against S3 buckets being open, for example.”

Founded in 2015, RedLock helps enable this visibility for its customers through its RedLock Cloud 360 platform, which monitors cloud security across multiple cloud platforms. Its dashboard lets users pinpoint issues and drill down to the root cause of an incident, offers auto-remediation, and uses AI to evaluate risky cloud resources so that high-priority issues bubble up to the top and are dealt with immediately.

“Legacy customers are still trying to understand how to enable their traditional security operations centers to have the visibility in the cloud,” Badhwar said. “When a problem has surfaced, can my security operations center understand that a developer even made a change that introduced a security risk? That fundamental visibility is lacking today.”

In the cloud, users have more permissions than they traditionally would have had, Badhwar said, including making code changes and pushing them into production. In a report released in October, RedLock found that 53 percent of organizations had exposed cloud storage to the public.

“What surprised me was that in May we had released the numbers and they were closer to 40 percent, and despite all the newsworthy items, despite the fact that cloud providers like AWS have made strong efforts to educate the market on the problem, we’ve actually seen a spike there versus a reduction,” he said.

The study also found compromised administrative user accounts at 38 percent of organizations.

“The cloud administration pages are sitting on the web, whereas in a traditional data center they were behind a firewall, so you had an additional layer of control. In the new world you don’t, so if you lose your credentials, somebody will log in to them; they don’t need firewall access or your VPN. So that’s a serious problem,” he said.

Badhwar recommends organizations start with the low-hanging fruit that must be addressed immediately: their configuration and compliance posture, and vulnerability management. They should also get more comfortable with automating the remediation of security issues.

“That’s exactly the notion of DevSecOps. When you’re moving at a higher miles per hour in the cloud, you must automate detection and remediation of security misconfigurations and security threats, and that’s certainly how we’ve architected our platform,” he said.
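RedLock's platform is proprietary, but the general pattern Badhwar describes, detecting a risky configuration and remediating it automatically, can be sketched in a few lines. The example below is an illustrative assumption built on boto3, not RedLock's or AWS's implementation: it finds buckets whose ACLs grant access to everyone and, outside of dry-run mode, resets the ACL to private.

```python
# Illustrative sketch of automated remediation: if a bucket ACL grants access
# to "everyone" (the AllUsers group), reset the bucket ACL to private.
# Not a production tool: real remediation pipelines log, alert, and honor
# policy exceptions before changing anything, since some buckets are public
# on purpose.
import boto3

ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

s3 = boto3.client("s3")


def remediate_public_buckets(dry_run=True):
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        acl = s3.get_bucket_acl(Bucket=name)
        is_public = any(
            g["Grantee"].get("URI") == ALL_USERS for g in acl["Grants"]
        )
        if not is_public:
            continue
        print(f"Found public ACL on {name}")
        if dry_run:
            print("  dry run: would reset ACL to private")
        else:
            s3.put_bucket_acl(Bucket=name, ACL="private")
            print("  ACL reset to private")


if __name__ == "__main__":
    remediate_public_buckets(dry_run=True)
```

In practice a job like this would run on a schedule or be triggered by configuration-change events rather than ad hoc, with the dry-run gate removed only after exceptions for intentionally public buckets are in place.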

 

About the Author

Nicole Henderson

Contributor, IT Pro Today

Nicole Henderson covers daily cloud news and features online for ITPro Today. Prior to ITPro Today, she was editor at Talkin' Cloud (now Channel Futures) and the WHIR. She has a bachelor of journalism from Ryerson University in Toronto.
