Lambda Cross Account Using Bucket Policy

Introduction

This lab demonstrates how to configure an S3 bucket policy (a type of resource-based policy) in AWS account 2 (the destination) that allows a Lambda function in AWS account 1 (the origin) to list the objects in that bucket using the Python boto3 SDK. If you only have one AWS account, simply repeat the instructions in that account and use the same account ID in both places.

Goals

  • S3 bucket policies
  • Resource-based policies versus identity-based policies (illustrated after this list)
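
Cross-account access requires an explicit allow on both sides: the identity-based policy attached to the Lambda execution role in account 1 (section 2) and the resource-based bucket policy in account 2 (section 3). If the bucket policy ever seems not to match, one quick check is to print which principal the function actually runs as. The sketch below is optional, uses only the STS GetCallerIdentity API, and assumes nothing beyond the role and function names used later in this lab.

import boto3

def lambda_handler(event, context):
    # Print the ARN of the identity this function runs as. For a function using
    # Lambda-List-S3-Role this is an assumed-role ARN such as
    # arn:aws:sts::<account1>:assumed-role/Lambda-List-S3-Role/<session-name>;
    # the bucket policy in section 3 grants the underlying role
    # arn:aws:iam::<account1>:role/Lambda-List-S3-Role, which covers these sessions.
    identity = boto3.client('sts').get_caller_identity()
    print(identity['Arn'])
    return identity['Arn']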

1. Identify (or create) S3 bucket in account 2

  1. In account 2, sign in to the AWS Management Console as an IAM user or role, and open the S3 console at https://console.aws.amazon.com/s3
  2. Choose an S3 bucket that contains some objects. You will allow the Lambda function in the other account to list the objects in this bucket.
  3. If you would rather create a new bucket to use, follow these directions (or use the optional script after this list)
  4. Record the bucket name; you will need it in the steps that follow
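
If you prefer to script this step, the sketch below creates a bucket and uploads a few small objects so there is something to list. The bucket name example-lab-bucket-account2 is a placeholder (bucket names must be globally unique); run it with credentials for account 2.

import boto3

s3 = boto3.client('s3')
bucket = 'example-lab-bucket-account2'  # placeholder; choose a globally unique name

# Create the bucket (in us-east-1 no CreateBucketConfiguration is needed; in other
# regions pass CreateBucketConfiguration={'LocationConstraint': '<region>'})
s3.create_bucket(Bucket=bucket)

# Upload a few small objects so the Lambda function has something to list
for key in ('sample/object1.txt', 'sample/object2.txt', 'sample/object3.txt'):
    s3.put_object(Bucket=bucket, Key=key, Body=b'hello from account 2')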

2. Create role for Lambda in account 1

  1. In account 1, sign in to the AWS Management Console as an IAM user or role, and open the IAM console at https://console.aws.amazon.com/iam/
  2. Click Roles in the left navigation, then click Create role
  3. AWS service will be pre-selected; select Lambda, then click Next: Permissions
  4. Do not select any managed policies, click Next: Tags
  5. Click Next: Review
  6. Enter Lambda-List-S3-Role for the Role name, then click Create role
  7. From the list of roles click the name of Lambda-List-S3-Role
  8. Click Add inline policy, then click the JSON tab
  9. Replace the sample JSON with the following policy
    • Replace account1 with the AWS Account number (no dashes) of account 1
    • Replace bucketname with the S3 bucket name from account 2
    • Then click Review Policy
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3ListBucket",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::bucketname"
        },
        {
            "Sid": "logsstreamevent",
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:us-east-1:account1:log-group:/aws/lambda/Lambda-List-S3*/*"
        },
        {
            "Sid": "logsgroup",
            "Effect": "Allow",
            "Action": "logs:CreateLogGroup",
            "Resource": "*"
        }
    ]
}
  10. Name this policy Lambda-List-S3-Policy, then click Create policy (a scripted equivalent is shown below).
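
If you would rather create the role from code, a minimal boto3 sketch is shown below. It creates the same role with the standard Lambda trust policy and attaches the inline policy from step 9; as in the console steps, account1 and bucketname are placeholders, and the script must run with credentials for account 1.

import json
import boto3

iam = boto3.client('iam')

# Trust policy so the Lambda service can assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='Lambda-List-S3-Role',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Inline policy from step 9; replace account1 and bucketname as described above
inline_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "S3ListBucket", "Effect": "Allow",
         "Action": ["s3:ListBucket"],
         "Resource": "arn:aws:s3:::bucketname"},
        {"Sid": "logsstreamevent", "Effect": "Allow",
         "Action": ["logs:CreateLogStream", "logs:PutLogEvents"],
         "Resource": "arn:aws:logs:us-east-1:account1:log-group:/aws/lambda/Lambda-List-S3*/*"},
        {"Sid": "logsgroup", "Effect": "Allow",
         "Action": "logs:CreateLogGroup", "Resource": "*"}
    ]
}

iam.put_role_policy(
    RoleName='Lambda-List-S3-Role',
    PolicyName='Lambda-List-S3-Policy',
    PolicyDocument=json.dumps(inline_policy)
)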

3. Create bucket policy for the S3 bucket in account 2

  1. In account 2, sign in to the AWS Management Console as an IAM user or role, and open the S3 console at https://console.aws.amazon.com/s3
  2. Click on the name of the bucket you will use for this lab
  3. Go to the Permissions tab
  4. Click Bucket Policy
  5. Enter the following JSON policy
    • Replace account1 with the AWS Account number (no dashes) of account 1
    • Replace bucketname with the S3 bucket name from account 2

This policy follows least privilege: only requests made with the IAM role from account 1 are allowed, and the aws:UserAgent condition further narrows access to the AWS Lambda Python runtime.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1565731301209",
            "Action": [
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::bucketname",
            "Principal": {
                "AWS":"arn:aws:iam::account1:role/Lambda-List-S3-Role"
            },
            "Condition": {
                "StringLike": {
                    "aws:UserAgent": "*AWS_Lambda_python*"
                }
            }
        }
    ]
}
  6. Click Save. (To apply the policy from code instead, see the sketch below.)
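
To apply the same bucket policy from code instead of the console, a sketch is shown below; it assumes credentials for account 2 and the same account1 and bucketname placeholders as the console steps.

import json
import boto3

s3 = boto3.client('s3')
bucket = 'bucketname'  # the account 2 bucket from section 1

# Bucket policy from step 5; replace account1 with the account 1 account number
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "Stmt1565731301209",
        "Action": ["s3:ListBucket"],
        "Effect": "Allow",
        "Resource": f"arn:aws:s3:::{bucket}",
        "Principal": {"AWS": "arn:aws:iam::account1:role/Lambda-List-S3-Role"},
        "Condition": {"StringLike": {"aws:UserAgent": "*AWS_Lambda_python*"}}
    }]
}

s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(bucket_policy))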

4. Create Lambda in account 1

  1. In account 1, open the Lambda console at https://console.aws.amazon.com/lambda
  2. Click Create a function
  3. Accept the default Author from scratch
  4. Enter Lambda-List-S3 for the Function name
  5. Select the Python 3.7 runtime
  6. Expand Permissions, choose Use an existing role, then select Lambda-List-S3-Role
  7. Click Create function
  8. Replace the example function code with the following
    • Replace bucketname with the S3 bucket name from account 2
import boto3

def lambda_handler(event, context):
    try:
        # Create an S3 client
        s3 = boto3.client('s3')

        # List up to 10 objects in the account 2 bucket
        objlist = s3.list_objects(Bucket='bucketname', MaxKeys=10)

        # 'Contents' is only present when the bucket has at least one object
        print(objlist['Contents'])
        return str(objlist['Contents'])

    except Exception as e:
        print(e)
        raise e
  9. Click Save

  10. Click Test, accept the default event template, enter an event name for the test, then click Create

  11. Click Test again; in a few seconds the function output should highlight green, and you can expand the detail to see the response from the S3 API. You can also invoke the function from code, as sketched below.
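
Outside the console, the function can also be invoked with boto3 from any machine with account 1 credentials. The sketch below assumes the function name Lambda-List-S3 and the us-east-1 region.

import json
import boto3

# Invoke the function synchronously and print its response payload
client = boto3.client('lambda', region_name='us-east-1')
response = client.invoke(
    FunctionName='Lambda-List-S3',
    InvocationType='RequestResponse',
    Payload=json.dumps({}).encode('utf-8')
)
print(response['Payload'].read().decode('utf-8'))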

5. Tear down this lab

  • Remove the Lambda function, then the Lambda-List-S3-Role role and its inline policy
  • If you created a new S3 bucket for this lab, you may remove it; otherwise, remove the bucket policy you added in account 2 (a scripted teardown follows)
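
For a scripted teardown, the sketch below removes the Lambda function, the inline policy, and the role in account 1, and the bucket policy (and optionally the bucket) in account 2. The profile names account1 and account2 are placeholders for credentials in the respective accounts.

import boto3

# Credentials for each account; the profile names are placeholders
account1 = boto3.Session(profile_name='account1')  # account 1 credentials
account2 = boto3.Session(profile_name='account2')  # account 2 credentials

# Account 1: remove the function, then the inline policy, then the role
account1.client('lambda', region_name='us-east-1').delete_function(FunctionName='Lambda-List-S3')
iam = account1.client('iam')
iam.delete_role_policy(RoleName='Lambda-List-S3-Role', PolicyName='Lambda-List-S3-Policy')
iam.delete_role(RoleName='Lambda-List-S3-Role')

# Account 2: remove the bucket policy; delete the bucket only if you created it for this lab
s3 = account2.client('s3')
s3.delete_bucket_policy(Bucket='bucketname')
# s3.delete_bucket(Bucket='bucketname')  # the bucket must be empty before it can be deleted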