Integrating AWS S3 with MuleSoft

AWS is one of the most popular cloud platforms. For a detailed introduction, check the earlier post:

  • AWS Cloud Computing - AWS and its uses.

AWS S3

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance.

Benefits

  • Industry-leading performance, scalability, availability, and durability
  • Wide range of cost-effective storage classes
  • Unmatched security, compliance, and audit capabilities
  • Easily manage data and access controls
  • Query-in-place capabilities for running analytics directly on data at rest

The AWS S3 Connector

This post uses Amazon S3 Connector v5.6.x, which requires Mule 4.1.1 or higher.

Anypoint Connector for Amazon S3 (Amazon S3 Connector) provides connectivity to the Amazon S3 API, enabling you to interface with Amazon S3 to store objects, download and use data with other AWS services, and build applications that require internet storage.
The Amazon S3 connector is built using the SDK for Java.


Connector Configurations

  1. First, import the AWS S3 connector from Exchange.
  2. Create a connector configuration as shown below.

Now, we will use this configuration with some of the available operations.
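As a rough sketch, the global configuration typically looks like the following. The configuration name, property placeholders, and region here are illustrative, and exact attribute names can vary slightly between connector versions:

```xml
<!-- Global Amazon S3 connector configuration (illustrative names) -->
<s3:config name="Amazon_S3_Configuration">
    <!-- Credentials come from a properties file rather than being hard-coded -->
    <s3:connection accessKey="${aws.accessKey}"
                   secretKey="${aws.secretKey}"
                   region="us-east-1"/>
</s3:config>
```

Keeping the access and secret keys in a secure properties file (rather than inline) lets the same application move between environments without code changes.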


Some available operations in the connector
  • List Buckets - Returns a list of all Amazon S3 buckets that the authenticated sender of the request owns.
List Buckets

This operation returns all the buckets in the given account. We can then add a DataWeave transform to filter or reshape the output before it is returned in the response.
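A flow that lists buckets and keeps only their names might look like the sketch below. The flow name and configuration name are placeholders, and the payload field names assume the operation returns a list of bucket metadata objects:

```xml
<flow name="list-buckets-flow">
    <!-- Returns metadata for every bucket owned by the authenticated account -->
    <s3:list-buckets config-ref="Amazon_S3_Configuration"/>
    <!-- DataWeave transform: keep only the bucket names -->
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload map ((bucket) -> bucket.name)]]></ee:set-payload>
        </ee:message>
    </ee:transform>
</flow>
```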

  • Create Bucket - Creates a new bucket. The connector must not be configured as anonymous for this operation to succeed. Bucket names must be unique across all of Amazon S3, that is, among all Amazon S3 users.
  • Delete Bucket - Deletes the specified bucket. All objects in the bucket must be deleted before the bucket can be deleted. This can be overridden by setting force = true.

The configuration below won't delete the bucket unless all of its contents have been deleted first.

Delete bucket without force delete
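A non-forced delete can be sketched as follows (bucket and configuration names are placeholders). The operation fails with an error if the bucket still contains objects:

```xml
<!-- Fails if the bucket is not empty -->
<s3:delete-bucket config-ref="Amazon_S3_Configuration"
                  bucketName="my-demo-bucket"/>
```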

We also have a force-delete option, which deletes the bucket even if it still contains objects. The configuration below works in that case.

Delete bucket with force delete

Note - The force option accepts a DataWeave expression as input, so we can set it dynamically based on certain conditions.
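Using the same placeholder names, a forced delete with the flag driven by a DataWeave expression might look like this (`vars.allowForce` is a hypothetical flow variable):

```xml
<!-- Deletes the bucket and everything in it when vars.allowForce is true -->
<s3:delete-bucket config-ref="Amazon_S3_Configuration"
                  bucketName="my-demo-bucket"
                  force="#[vars.allowForce default false]"/>
```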

  • List Objects - Lazily lists all objects for a given prefix. Because S3 does not limit the number of objects, this list can retrieve an arbitrary number of objects.

List Objects allows us to add prefixes, markers, and similar filters, and we can again add DataWeave code, as in the List Buckets example, to selectively display certain content.
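A prefix-filtered listing can be sketched as follows (bucket name and prefix are placeholders):

```xml
<!-- Lazily pages through all objects whose keys start with "invoices/" -->
<s3:list-objects config-ref="Amazon_S3_Configuration"
                 bucketName="my-demo-bucket"
                 prefix="invoices/"/>
```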

  • Create Object - Uploads an object to S3. Supported contents are input streams, strings, byte arrays, and files.

The Create Object configuration allows a lot of tweaks. We can make an object publicly readable using the Canned ACL options; here, I am keeping it private.

Create Object Flow
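A minimal Create Object step, assuming the message payload already holds the content to upload, might look like this sketch (bucket name, key, and the exact Canned ACL attribute name are illustrative and may differ by connector version):

```xml
<!-- Uploads the current payload as an object; Canned ACL kept private -->
<s3:create-object config-ref="Amazon_S3_Configuration"
                  bucketName="my-demo-bucket"
                  key="reports/2021/summary.json"
                  cannedAcl="PRIVATE"/>
```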
  • Delete Object - Deletes a given object. Only the owner of the bucket containing the version can perform this operation.
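Deleting a single object follows the same pattern, again with placeholder names:

```xml
<!-- Removes one object from the bucket -->
<s3:delete-object config-ref="Amazon_S3_Configuration"
                  bucketName="my-demo-bucket"
                  key="reports/2021/summary.json"/>
```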

Some common use cases

Automate common business operations by integrating Amazon S3 with other business applications such as ERP, analytics systems, and data warehouse systems. Some examples are:

  • Build apps with native cloud-based storage - Connect your mobile apps to scalable Amazon S3 buckets to store files, images, and so on.
  • Back up and archive critical data - Leverage the Amazon S3 connector to seamlessly integrate with your ERP, CRM, EDI, and fulfillment systems and archive necessary data.
  • Drive business intelligence and optimize operational outcomes - Leverage S3 as a data lake and extract valuable insights using query-in-place, analytics, and machine learning tools.

I will cover more operations in a separate blog! Cheers.