Kyle Hilgendorf

A member of the Gartner Blog Network

Research Director
3 years with Gartner
13 years in IT industry

Kyle Hilgendorf works as a Research Director in Gartner for Technology Professionals (GTP). He covers public cloud computing and hybrid cloud computing. His areas of focus include cloud computing technology, providers, IaaS, SaaS, managed hosting, and colocation. He brings 10 years of enterprise IT operations and architecture experience.


Cloud Security Configurations: Who is responsible?

by Kyle Hilgendorf  |  April 2, 2013  |  3 Comments

A Rapid7 report surfaced last week revealing that some 126 billion AWS S3 objects were exposed to the general public.  AWS has since taken the brunt of security criticism from many blogs and tech magazines for its “lack of security”.  But I have to say, as an objective analyst: this is not the fault of AWS.

Security in S3 is binary for each object: private or public.  Within private, there are a number of different settings one can employ.  Private is also the default security control for all S3 objects; the AWS customer must manually configure each individual object as “public”.  And there might be very good reason for doing so.  For example, companies use S3 all the time to post public information that they want to share or make accessible to the world.  S3, and other object stores, are great for hosting content such as public websites, videos, webinars, recordings, or pictures.  In other words, there may be very good reasons why 126 billion objects are publicly accessible.
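To make the per-object model concrete, here is a minimal sketch (my own illustration, not AWS code) of the decision S3 effectively makes: an object is readable by anonymous users only if its access control list (ACL) grants READ to the AllUsers group, and the default ACL grants access only to the owner.

```python
# Illustrative sketch of S3's per-object ACL model (not AWS source code).
# The AllUsers group URI is the real grantee AWS uses for anonymous access.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def default_acl(owner_id):
    """The ACL every new S3 object gets unless the customer changes it:
    a single FULL_CONTROL grant for the bucket owner."""
    return [{"grantee": owner_id, "permission": "FULL_CONTROL"}]

def is_public(acl):
    """True only if anonymous users can read the object."""
    return any(
        g["grantee"] == ALL_USERS and g["permission"] in ("READ", "FULL_CONTROL")
        for g in acl
    )
```

With this model, `is_public(default_acl("owner-123"))` is `False` until someone deliberately appends an AllUsers READ grant – which is exactly the manual step the customer must take.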

But for those objects that should not have been made public, the question really comes down to who is responsible – the provider or the customer?  I’ll argue this is the customer’s responsibility.  AWS offers customers what they want.  Security or public accessibility – the customer chooses.  There are reasons for both, and customers have the power to choose.  Consider how many customers would be upset if AWS took away the public accessibility option from S3.  I’d bet a large percentage of S3 customers would complain, as S3 is great for publishing public websites and content.

If AWS has any fault here, it is making self-service and automation too smooth and easy – but isn’t that the goal of public cloud?  It is quite easy to create a bucket policy that opens up access to all current and future objects in a bucket for anonymous users, and perhaps that is what happened to some of the more critical or private data that Rapid7 found in this study.  It is possible that one admin created a bucket policy and another admin or user uploaded sensitive data into the bucket, unaware of the security configuration.  At the same time, these bucket policies can be incredibly helpful for organizations that want to expose all objects in a bucket – for instance, a public web site.  However, the AWS management console does not provide simple visibility into which objects are accessible publicly or to anonymous users.  To gain this level of insight, you will need an understanding of the AWS S3 API.
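The kind of bucket policy described above – one that silently opens every current and future object in a bucket to anonymous readers – looks roughly like this (the bucket name is hypothetical; the structure follows AWS's standard policy grammar):

```python
import json

# Sketch of an anonymous-read S3 bucket policy.
# Principal "*" combined with the "/*" wildcard in Resource is what
# exposes all current AND future objects in the bucket.
bucket = "example-bucket"  # hypothetical name
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadForAllObjects",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}
print(json.dumps(policy, indent=2))
```

Attached to a bucket serving a public web site, this policy is exactly what you want; attached to a bucket that later receives payroll files, it is a breach waiting to be found.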

But, in the end, customers are responsible.  Customers will always be responsible in the public cloud for their applications and their data – beware of configurations, features, and options.  I do not dispute that many objects found in the report may contain sensitive information; unfortunately, user error or confusion could have led to the accidental public exposure of such objects.  Therefore, it is paramount that organizations employing public cloud services build not only clear governance practices, but also monitoring and alerting practices to raise awareness within the organization when digital assets may be exposed, or not secured in the fashion the data warrants.
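As a starting point for the monitoring practice suggested above, a periodic job could pull each bucket's policy via the S3 API and flag statements that grant anonymous reads. A minimal sketch of the flagging logic (my own illustration; it simplifies the policy grammar, e.g. it does not handle the `{"AWS": "*"}` principal form or Condition blocks):

```python
def anonymous_read_statements(policy):
    """Return the statements in a bucket policy that allow s3:GetObject
    to everyone (Principal "*") - candidates for a security alert."""
    flagged = []
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):  # Action may be a string or a list
            actions = [actions]
        if (stmt.get("Effect") == "Allow"
                and stmt.get("Principal") == "*"
                and "s3:GetObject" in actions):
            flagged.append(stmt)
    return flagged
```

Feeding every bucket's policy through a check like this, and alerting on any unexpected hit, is the kind of lightweight governance control that would have caught much of what Rapid7 found.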


Category: AWS Cloud Providers