How To Check If Your S3 Buckets Allow Public Read ACLs

Update 2017/07/20: Code's on GitHub under jgreenemi/DescribePublicBuckets! This script is now available in both Python and Bash to give you some flexibility.

Problem

There's been a recent wave of accidental information exposure caused by AWS S3 users improperly configuring their bucket ACLs, leaving bucket contents readable by everyone in the world. It seems like such a simple issue to avoid (and it is), but businesses keep making the news for storing sensitive information in S3 while overlooking the bucket permissions that should protect it. So how do you go through all your S3 buckets and determine which ones have public ACLs?

So, go through all your buckets and disable the "Read" permissions to the "Everyone" group, right?
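If you do find (or already know of) offending buckets, the fix itself is one CLI call per bucket via `aws s3api put-bucket-acl`. Here's a minimal dry-run sketch; the bucket names are placeholders, and the leading echo only prints each command rather than running it, so drop it once you've double-checked the list:

```shell
# Hypothetical list of buckets you've identified as public.
BUCKETS_TO_FIX=("example-public-bucket-1" "example-public-bucket-2")

for BUCKET_NAME in "${BUCKETS_TO_FIX[@]}"; do
  # Dry run: print the command that would reset this bucket's ACL to private.
  echo aws s3api put-bucket-acl --acl private --bucket "${BUCKET_NAME}"
done
```

Note that resetting the bucket ACL to private doesn't touch per-object ACLs, so objects that were individually made public stay public until you fix their ACLs too.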

Solution - Bash Version

Well, that's easier said than done if you have a few hundred buckets. To make your life easier if you're trying to solve this problem right now, I've come up with a quick Bash script that tells you which of your S3 buckets have public Read and/or Write permissions, in case you haven't already received a warning email from AWS about them:

#!/usr/bin/env bash

# DescribePublicBuckets
# A script using the AWS CLI to determine if any S3 buckets in your account are using Public ACLs.
#

# If you haven't updated your AWS CLI in a while and installed it through pip, go ahead and update it now.
#pip install --user --upgrade awscli

# List out all the buckets you have, store the results in a variable.
BUCKETS_LIST=($(aws s3api list-buckets --output text | grep BUCKETS | cut -f3))
PUBLIC_READ_BUCKETS=()
PUBLIC_WRITE_BUCKETS=()

for BUCKET_NAME in "${BUCKETS_LIST[@]}"; do
  # Describe this bucket's ACLs. If there is a "GRANTS READ" line that is followed by a
  # "GRANTEE Group http://acs.amazonaws.com/groups/global/AllUsers" line, this means Everyone has read access
  # to the bucket.
  # Checks for the same group being present with "GRANTS WRITE".

  PUBLIC_ACL_INDICATOR="http://acs.amazonaws.com/groups/global/AllUsers"
  printf "%s\n" "Checking Bucket ${BUCKET_NAME}:"

  if aws s3api get-bucket-acl --output text --bucket "${BUCKET_NAME}" | grep -A1 READ | grep -q "${PUBLIC_ACL_INDICATOR}"
    then
      printf "%s\n" "Bucket ${BUCKET_NAME} allows Everyone to list its objects!"
      PUBLIC_READ_BUCKETS+=("${BUCKET_NAME}")
    else
      printf "%s\n" "No public read access detected."
  fi

  if aws s3api get-bucket-acl --output text --bucket "${BUCKET_NAME}" | grep -A1 WRITE | grep -q "${PUBLIC_ACL_INDICATOR}"
    then
      printf "%s\n" "Bucket ${BUCKET_NAME} allows Everyone to write objects!"
      PUBLIC_WRITE_BUCKETS+=("${BUCKET_NAME}")
    else
      printf "%s\n" "No public write access detected."
  fi

  printf "%s\n" "----"
done

printf "%s\n" ""
printf "%s\n" "Buckets with READ permission issues (if any):"
printf "%s\n" "${PUBLIC_READ_BUCKETS[@]}"

printf "%s\n" ""
printf "%s\n" "Buckets with WRITE permission issues (if any):"
printf "%s\n" "${PUBLIC_WRITE_BUCKETS[@]}"

Yes, it's quick and dirty, and no, it doesn't use a JSON parsing utility, but it's entirely self-contained: there's nothing to install beyond the awscli, which makes it handy when you need a quick answer.
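If you'd rather parse the ACL response as structured JSON instead of grepping text output, the core check is just a filter over the Grants list. A minimal sketch, using a hardcoded sample shaped like `aws s3api get-bucket-acl --output json` (the owner ID is made up for illustration) in place of a real API call:

```python
import json

# Sample response shaped like `aws s3api get-bucket-acl --output json`.
# The owner ID here is a made-up placeholder.
SAMPLE_ACL = json.loads("""
{
  "Owner": {"ID": "abc123"},
  "Grants": [
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
    {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
     "Permission": "FULL_CONTROL"}
  ]
}
""")

PUBLIC_ACL_INDICATOR = 'http://acs.amazonaws.com/groups/global/AllUsers'


def public_permissions(acl):
    """Return the permissions granted to the AllUsers (Everyone) group."""
    return [grant['Permission'] for grant in acl['Grants']
            if grant['Grantee'].get('URI') == PUBLIC_ACL_INDICATOR]


print(public_permissions(SAMPLE_ACL))  # → ['READ']
```

This is essentially what the Boto3 version below does, just with the AWS call stubbed out by a sample dictionary.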

Solution - Python Version

Want that in Python? Try this!

#!/usr/bin/env python
# -*- coding: utf-8 -*-

# describe_public_buckets
# A Python script to use Boto3 to list out any AWS S3 buckets in your account that have public access based on their
# ACLs, either Read or Write permissions.
#

import boto3
import logging
import sys

from pprint import pprint


def describe_public_buckets():
    """
    Search your AWS account for S3 buckets that have public READ and WRITE permissions.
    The returned dictionary looks something like the following:

    {
      'READ': ['a-readable-bucket', 'another-readable-bucket'],
      'READ_ACP': ['a-bucket-with-readable-permissions', 'another-bucket-with-readable-permissions'],
    }

    :return: A dictionary with keys consisting of permission names, and values as a list of buckets that have that permission.
    :rtype: dict
    """

    # Set up logger. Feel free to change the logging format or add a FileHandler per your needs.
    logformat = ('%(asctime)s %(levelname)s [%(name)s] %(message)s')
    logging.basicConfig(level=logging.INFO, format=logformat)
    logger = logging.getLogger()
    logger.info('Starting.')

    public_acl_indicator = 'http://acs.amazonaws.com/groups/global/AllUsers'
    permissions_to_check = ['READ', 'WRITE']
    public_buckets = {}

    try:
        # Create S3 client, describe buckets.
        s3client = boto3.client('s3')
        list_bucket_response = s3client.list_buckets()

        for bucket_dictionary in list_bucket_response['Buckets']:
            bucket_acl_response = s3client.get_bucket_acl(Bucket=bucket_dictionary['Name'])

            for grant in bucket_acl_response['Grants']:
                permission = grant.get('Permission', '')
                grantee_uri = grant.get('Grantee', {}).get('URI', '')

                # Substring matching is deliberate: e.g. READ_ACP matches 'READ',
                # so ACL-readable buckets get reported under their own key.
                if grantee_uri == public_acl_indicator and any(
                        p in permission for p in permissions_to_check):
                    public_buckets.setdefault(permission, []).append(bucket_dictionary['Name'])

        logger.info('The following buckets have public permissions:')
        pprint(public_buckets)

        return public_buckets

    except Exception:
        err = 'describe_public_buckets failed! '
        for e in sys.exc_info():
            err += str(e)
        logger.error(err)

if __name__ == '__main__':
    describe_public_buckets()

Hope these help! For full install and execution instructions, follow along with the directions in the README. If you run into any issues, do raise them on the GitHub repo.