
AWS Lambda + S3: Trigger image resize upon upload to S3 Bucket

Amazon Web Services (AWS) offers a plethora of services that allow developers to build sophisticated applications with increased flexibility, scalability, and reliability. Among them, AWS S3 and Lambda are particularly powerful when used in combination.

AWS S3, or Simple Storage Service, provides an object storage service that offers industry-leading scalability, data availability, security, and performance. This means that it allows you to store and retrieve vast amounts of data from anywhere on the web.

Meanwhile, AWS Lambda is a part of Amazon's serverless computing services. It allows you to run your code without provisioning or managing servers. You pay only for the compute time that you consume and there is no charge when your code is not running. Essentially, Lambda lets you focus more on your application logic and less on infrastructure management.

By integrating AWS S3 with Lambda, we can actually automate a lot of tasks. For instance, one potent application of S3 and Lambda is to automatically resize images when they're uploaded to an S3 bucket. This is incredibly useful in scenarios where there are limitations on image sizes, such as for web pages or databases.

Section 1: Creating an S3 bucket using Python on AWS

Creating an S3 bucket is a straightforward process in the AWS Management Console. However, you can also achieve this by executing Python code, especially when you need to automate the process.

```python
# import the boto3 library, the AWS SDK for Python
import boto3

# Initialize the S3 client
s3 = boto3.client('s3')

# Bucket names are globally unique, so pick a distinct name
bucket_name = 'my-bucket-09'

# Create the bucket. Outside us-east-1, you must also pass
# CreateBucketConfiguration={'LocationConstraint': '<your-region>'}.
s3.create_bucket(Bucket=bucket_name)
print(f'Bucket {bucket_name} created')
```
Python script for creating a bucket

This Python snippet leverages the `boto3` library, the AWS SDK for Python, which allows Python developers to write software that makes use of AWS services like Amazon S3, Amazon EC2, and others.

In the script, `boto3.client('s3')` initializes a client that represents Amazon S3. This low-level, service-oriented client is commonly used for service-level actions.

The script then calls the `create_bucket` method on the S3 client with a user-defined name. Remember, bucket names are globally unique, meaning your bucket name must be distinct from all existing bucket names on Amazon S3.

This Python automation avoids repeating the manual steps of creating a bucket via the AWS Management Console. It is particularly useful when you have to frequently create and delete buckets, and the script can be executed from anywhere in your application without access to the AWS Console.
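Since `create_bucket` fails when a name violates S3's naming rules, it can help to validate names before making the API call. The helper below is a simplified, hypothetical check (the real S3 rules also allow dots and forbid names that look like IP addresses):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    # Simplified check: 3-63 characters, lowercase letters, digits and
    # hyphens, starting and ending with a letter or digit.
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9-]{1,61}[a-z0-9]", name))
```

For example, `is_valid_bucket_name("my-bucket-09")` passes, while names containing uppercase letters or underscores are rejected.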

Note:
You need the necessary privileges to create an AWS bucket. Also, keep the destination bucket DIFFERENT from the source bucket. Otherwise, the Lambda function would be invoked in an infinite loop, because each resized image uploaded to the same bucket's resized folder would trigger it again.
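Even with separate buckets, a cheap defensive guard inside the handler makes the no-recursion rule explicit. A minimal sketch, assuming the `resized/` key prefix used later in this post:

```python
def should_process(key: str) -> bool:
    # Skip objects that are already resized outputs, so a misconfigured
    # trigger on the destination bucket cannot cause an infinite loop.
    return not key.startswith("resized/")
```

In the handler, you would simply return early whenever `should_process(key)` is False.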

Section 2: Step-by-step instructions for resizing images with AWS Lambda upon upload to S3

Write the Lambda Function
1. Open a code editor and create a file named lambda_function.py.

2. Add the following code to resize images using the Pillow library. Note that the resized image is uploaded to a separate destination bucket, as explained in the note above:

```python
import boto3
from PIL import Image
from urllib.parse import unquote_plus

# Initialize the S3 client
s3 = boto3.client('s3')

# Define the destination bucket for resized images (keep it different
# from the source bucket to avoid re-triggering this function)
DESTINATION_BUCKET = 'my-destination-bucket-09'

def lambda_handler(event, context):
    try:
        # Parse the S3 event; object keys arrive URL-encoded, so decode them
        source_bucket = event['Records'][0]['s3']['bucket']['name']
        key = unquote_plus(event['Records'][0]['s3']['object']['key'])

        # Download the image to Lambda's writable /tmp directory
        download_path = '/tmp/{}'.format(key.split('/')[-1])
        s3.download_file(source_bucket, key, download_path)

        # Resize the image to 200x200 pixels
        output_path = '/tmp/resized-{}'.format(key.split('/')[-1])
        with Image.open(download_path) as img:
            img = img.resize((200, 200))
            img.save(output_path)

        # Upload the resized image to the destination bucket
        resized_key = f'resized/{key}'
        s3.upload_file(output_path, DESTINATION_BUCKET, resized_key)

        return {'statusCode': 200,
                'body': f'Resized image uploaded to {DESTINATION_BUCKET}/{resized_key}'}
    except Exception as e:
        print(f"Error: {str(e)}")
        return {'statusCode': 500, 'body': f'Error processing file: {str(e)}'}
```

3. Install the Pillow library into the current directory: `pip3 install pillow -t .` (run this in a Linux-compatible environment so the compiled wheels match the Lambda runtime).

4. Zip the function and its dependencies: `zip -r resize_images.zip .`
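If you prefer to script the packaging step too, Python's standard `zipfile` module can produce the same archive. This is an optional sketch, not required by the steps above:

```python
import pathlib
import zipfile

def zip_directory(src_dir: str, zip_path: str) -> None:
    # Recursively add every file under src_dir, storing paths relative
    # to src_dir (the same layout `zip -r` produces from inside it).
    root = pathlib.Path(src_dir)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(root))
```

Calling `zip_directory(".", "resize_images.zip")` from the project directory yields an archive with `lambda_function.py` at its root, which is what Lambda expects.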

Create the Lambda Function
1. Log in to the AWS Console.

2. Navigate to the Lambda service in the AWS Management Console.

3. Click Create function and choose Author from scratch.

4. Enter "resize_images" as the function name.
Choose Python 3.x as the runtime.
Select "Create a new role with basic Lambda permissions" for permissions.
Click Create function.

Upload the Zip Code
1. In the Lambda function console, go to the Code section.

2. Click Upload from > .zip file and upload resize_images.zip.

3. Select the zip file we created earlier and click the Save button.

4. Refresh the page if the code does not update automatically.

Configure the S3 Trigger
1. Scroll down to the Function overview section and click Add trigger.

2. Choose S3 from the trigger options. Select the bucket name my-bucket-09. For the event type, select PUT (for uploads). Click the Add button.

3. You will be redirected to the Lambda function page, where you will see the trigger added.
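For reference, the console trigger you just added corresponds to an S3 bucket notification configuration, which can also be expressed with boto3. A hypothetical equivalent (the account ID and region in the ARN below are placeholders):

```python
# Placeholder ARN -- substitute your own Lambda function's ARN.
notification = {
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:resize_images",
            "Events": ["s3:ObjectCreated:Put"],
        }
    ]
}

# Applying it requires appropriate S3 permissions:
# import boto3
# boto3.client("s3").put_bucket_notification_configuration(
#     Bucket="my-bucket-09", NotificationConfiguration=notification)
```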

Adjust Bucket Permissions
1. Go to the Permissions tab in the Lambda function console.

2. Click on the execution role (resize_images-role-ves7r1dj) and you will be redirected to the IAM Console.

3. Attach the policy to allow S3 access:
Click Add permissions > Attach policies.
Search for and attach AmazonS3FullAccess (or a custom policy granting read/write permissions to my-bucket-09).

4. Click the Add permissions button to continue.
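AmazonS3FullAccess is the quick option; a tighter alternative is a custom policy scoped to just the two buckets. A sketch of such a policy document, using the bucket names from the examples in this post:

```python
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Read uploads from the source bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::my-bucket-09/*",
        },
        {   # Write resized images to the destination bucket
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::my-destination-bucket-09/*",
        },
    ],
}
print(json.dumps(policy, indent=2))
```

You would paste this JSON into IAM as an inline or customer-managed policy attached to the function's execution role.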

Section 3: Testing

Testing The Lambda Function
1. Upload an image to your source bucket.

2. Go to the destination bucket and open the resized folder. You will see the resized image uploaded there.

3. In case of an exception or error, go to CloudWatch in the AWS Console and find the relevant log stream.

Section 4: Conclusion

In conclusion, the seamless integration of AWS S3 and Lambda empowers developers to automate tasks like automatic image resizing with remarkable efficiency. By eliminating the need for server management, this combination allows developers to focus on crafting robust application logic, enhancing productivity, and delivering scalable solutions tailored to diverse business needs.

Looking for a reliable tech partner? FAMRO-LLC can help you!

Our development rockstars excel in creating robust and scalable solutions using Django, a powerful Python framework known for its rapid development capabilities and clean, pragmatic design. FAMRO’s team ensures that complex web applications are built quickly and with precision. Their expertise allows businesses to focus on enhancing their digital presence while leaving the intricacies of backend development in skilled hands.

On the deployment side, FAMRO's Infrastructure team takes charge with Kubernetes, a leading platform for container orchestration. Their deep knowledge of Kubernetes ensures that applications are seamlessly deployed, scaled, and managed in cloud environments. By automating key processes like service discovery, load balancing, and resource scaling, FAMRO’s Infrastructure team guarantees that applications not only perform well under high traffic but also remain resilient and easy to maintain. This combination of development and DevOps expertise enables FAMRO to deliver end-to-end, highly scalable solutions.

Please don't hesitate to Contact us for free initial consultation.

