Effectively Testing Our AWS S3 Utilization Using The S3Mock

Timo Eckhardt · Adobe Tech Blog · Dec 12, 2017

At Adobe we’re building services for the cloud. Hundreds of these serve customer data and enable workflows for Creative Cloud, Document Cloud, and Marketing Cloud. These services are hosted with cloud providers such as Amazon Web Services or Microsoft Azure, and they use provider-specific APIs and services such as AWS Simple Storage Service (S3) or Azure Blob Storage.

Practicing continuous integration and continuous deployment, we need our test automation to run as fast as possible and as early as possible. The majority of such tests will be Unit Tests to verify isolated blocks of code.

On top of that, we want to test service functionality as a whole, also during the build. That means the service has to be up and running and able to interact with all of its collaborators, including services managed by the cloud provider. This is essentially what the Test Pyramid describes.

Testing against AWS S3 API

In the past, we ran such service tests against the service under test, which in turn connected to managed backend services such as AWS S3. Doing so, we faced some disadvantages:

  • These tests are potentially long-running as they depend directly on managed services that include network latency, especially when running the build and tests on a local developer machine.
  • Tests rely on the availability and reachability of managed services. If a test run, for whatever reason, cannot access the managed services, the build fails and we are potentially unable to roll out new versions.

What we wanted to have, ideally, is a service that implements AWS S3 APIs and which can be started during the build with a minimal footprint.

There are various S3 API implementations out there, but none of those we evaluated provided all of the S3 functionality we needed. In short, the features we need are:

  • Getting and manipulating S3 Objects
  • Getting and manipulating S3 Buckets
  • Multipart uploads and copy of S3 Objects
  • KMS based encryption
  • HTTPS support

At this point we decided to implement our own mock implementation of AWS S3 API, the S3Mock.

Features

S3Mock implements basic functionality of Amazon’s AWS S3 API. The following is an overview of implemented S3 operations and features.

HTTPS Support

S3Mock provides an HTTP and an HTTPS endpoint. Defaults are

  • 9191 (HTTPS)
  • 9090 (HTTP)

Ports can be configured at startup by setting the following properties:

server.port=9191 (HTTPS)
http.port=9090 (HTTP)

The SSL certificate used is a self-signed certificate for localhost, which callers may need to ignore.
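For the AWS SDK for Java (v1), one way to accept the self-signed certificate in local tests is the SDK's system property for disabling certificate checking. A minimal sketch; suitable for test code only, never production:

```java
public class TrustSelfSignedCert {
    public static void main(String[] args) {
        // The AWS SDK for Java v1 skips TLS certificate validation when this
        // system property is set -- acceptable for local tests against S3Mock.
        System.setProperty("com.amazonaws.sdk.disableCertChecking", "true");
        System.out.println(System.getProperty("com.amazonaws.sdk.disableCertChecking"));
    }
}
```

Note that the `createS3Client()` helper mentioned in the Getting Started section below already returns a client configured for the HTTPS endpoint, so manual trust handling is mainly needed when building your own client.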

Working with S3 Objects

  • HTTP GET to retrieve an existing object, including range requests.
  • HTTP HEAD to retrieve metadata for an existing object.

Supported object metadata are:

  • ETag
  • Content-Type
  • Content-Length
  • Last-Modified
  • Content-Range for range requests

Manipulating S3 objects:

  • HTTP PUT operations for creating, updating or copying objects including Multipart Uploads.
  • HTTP DELETE for deleting existing objects.
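The object operations above can be sketched as a JUnit 4 test using the `S3MockRule` described in the Getting Started section. This is an illustrative sketch, not verbatim project code; the bucket and key names are made up for the example:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3Object;
import com.adobe.testing.s3mock.S3MockRule;
import org.junit.ClassRule;
import org.junit.Test;

public class ObjectOperationsTest {
    @ClassRule
    public static S3MockRule S3_MOCK_RULE = new S3MockRule();

    private final AmazonS3 s3 = S3_MOCK_RULE.createS3Client();

    @Test
    public void putGetDeleteObject() {
        s3.createBucket("test-bucket");

        // HTTP PUT creates the object
        s3.putObject("test-bucket", "file.txt", "Hello S3Mock");

        // HTTP GET retrieves it, including a range request for the first five bytes
        S3Object full = s3.getObject("test-bucket", "file.txt");
        S3Object range = s3.getObject(
            new GetObjectRequest("test-bucket", "file.txt").withRange(0, 4));

        // HTTP HEAD exposes metadata such as ETag and Content-Length
        ObjectMetadata meta = s3.getObjectMetadata("test-bucket", "file.txt");

        // HTTP DELETE removes the object
        s3.deleteObject("test-bucket", "file.txt");
    }
}
```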

Working with Buckets

Get Buckets:

  • HTTP GET to list existing Buckets.
  • HTTP HEAD to check whether a Bucket exists.

Manipulating Buckets:

  • HTTP PUT to create a new Bucket.
  • HTTP DELETE to delete an existing Bucket.

Create initial Buckets:

S3Mock will create a set of Buckets on startup, passed in the initialBuckets environment variable:

initialBuckets=Bucket-1,Bucket-2
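A minimal sketch of how such a comma-separated value maps to bucket names (an assumption about the internals, shown only to illustrate the property format):

```java
import java.util.Arrays;
import java.util.List;

public class InitialBuckets {
    public static void main(String[] args) {
        // Assumption: the property value is split on commas, one bucket per entry.
        String initialBuckets = "Bucket-1,Bucket-2";
        List<String> names = Arrays.asList(initialBuckets.split(","));
        names.forEach(System.out::println); // Bucket-1, then Bucket-2
    }
}
```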

KMS

Every incoming HTTP request is checked for server-side encryption headers.

If x-amz-server-side-encryption is aws:kms, then x-amz-server-side-encryption-aws-kms-key-id has to specify a registered KMS key reference. If this key reference is not found (or is empty), S3Mock returns HTTP 400 Bad Request.

Registering KMS key references is done by setting the validKmsKeys environment variable to a comma-separated list of key references:

validKmsKeys=arn:aws:kms:us-east-1:47110815:key/c51fdeea-f623-4a2b-90b5-15d72963cf9d,arn:aws:kms:us-east-1:47110815:key/c4353c4c-3318-460a-bdcc-b0a57bd8d9d8
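The validation rule described above can be sketched as follows. This is an assumption about how the check could be implemented, not S3Mock's actual code; the helper name `isValidKmsRequest` is hypothetical:

```java
import java.util.Set;

public class KmsKeyValidation {
    // Sketch (assumption): a request asking for aws:kms encryption is
    // accepted only if its key id is non-empty and registered.
    static boolean isValidKmsRequest(String sseHeader, String keyId,
                                     Set<String> validKmsKeys) {
        if (!"aws:kms".equals(sseHeader)) {
            return true; // no KMS encryption requested, nothing to validate
        }
        return keyId != null && !keyId.isEmpty() && validKmsKeys.contains(keyId);
    }

    public static void main(String[] args) {
        Set<String> keys = Set.of(
            "arn:aws:kms:us-east-1:47110815:key/c51fdeea-f623-4a2b-90b5-15d72963cf9d");

        // registered key -> accepted
        System.out.println(isValidKmsRequest("aws:kms",
            "arn:aws:kms:us-east-1:47110815:key/c51fdeea-f623-4a2b-90b5-15d72963cf9d",
            keys)); // true
        // empty key id -> would be answered with HTTP 400 Bad Request
        System.out.println(isValidKmsRequest("aws:kms", "", keys)); // false
    }
}
```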

Getting started

Run within a Test

The standalone server is available on Maven Central. Add the following dependency to start using it:

<dependency>
  <groupId>com.adobe.testing</groupId>
  <artifactId>s3mock</artifactId>
  <version>1.0.4</version>
  <scope>test</scope>
</dependency>

In your test, you can use the JUnit rule com.adobe.testing.s3mock.S3MockRule to automatically start and stop the S3Mock server and to get a properly configured AmazonS3 instance to interact with S3Mock via HTTPS.

@ClassRule
public static S3MockRule S3_MOCK_RULE = new S3MockRule();
private final AmazonS3 s3 = S3_MOCK_RULE.createS3Client();

Alternatively you can start the server within your code:

com.adobe.testing.s3mock.S3MockApplication.start();

This will start S3Mock listening on the default ports. To use random free ports, use:

S3MockApplication s3Mock = com.adobe.testing.s3mock.S3MockApplication.start("--server.port=0", "--http.port=0");
int httpsPort = s3Mock.getPort();
int httpPort = s3Mock.getHttpPort();
// do your tests
// ...
// shutdown server
s3Mock.stop();

Run as a Docker container

The Docker image is available on Docker Hub at https://hub.docker.com/r/adobe/s3mock/.

docker run -p 9090:9090 -p 9191:9191 -t adobe/s3mock

This will start S3Mock within a Docker Container exposing the ports 9191 (for HTTPS) and 9090 (for HTTP).

Using the AmazonS3 client from the AWS Java SDK

final AmazonS3 httpAmazonS3 = AmazonS3ClientBuilder.standard()
    .withCredentials(awsCredentials) // use any credentials here
    .withEndpointConfiguration(
        new EndpointConfiguration(
            "http://localhost:9090/",
            Region.US_Standard.getFirstRegionId()
        )
    )
    .build();

Contributions

Adobe S3Mock is an Open Source project under the Apache License 2.0 and is available on GitHub. Any contributions are welcome!

To contribute, fork the repository, make your changes, and open a pull request against the upstream repository. Adobe engineers will review your changes and release new versions.


Software Engineer living in Hamburg, GER. #Java, #Golang, #Docker, #Cloud, #Architecture. Love Music, Skateboarding, Surfing, Tattoos.