Active study with AWS using S3

Understanding the SDK with unit testing

An understanding of the SDK is also important for the Developer Associate Certification, since it is, after all, developer-centric. Whilst you won’t have to write code to pass the exam, the concepts behind the SDK are still important to understand if you want to do the actual job day to day.

Playing with code samples in the SDK is a great way to get to grips with the technology, and far more fun than just reading dry text from the AWS websites. If we focus on the right subjects, we can gain greater insight into a technology.

Back when I outlined a study plan, I mentioned it was important to have:

In depth knowledge of Lambda
In depth knowledge of S3
In depth knowledge of Dynamo

Since S3 is pretty important to know in depth, let’s talk “Learning Tests”.

Learning Tests

In a longer article, I wrote about the importance of Learning Tests to build upon your knowledge. I’ll explain how that’s relevant here.

I first stumbled across the concept of the ‘Learning Test’ in the book Clean Code. Simply put, rather than just reading the documentation for software, use an available testing framework to **prove** what the documentation says.

AWS has an SDK for every technology, and (for anything that will be on the exam), there are examples too.

So we can read through the docs and write unit tests to assert what the documentation says — or more to the point, what we understand it to say.
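Before applying this to AWS, it’s worth seeing the pattern in miniature. Here’s a self-contained sketch — plain Java with no JUnit or AWS dependency, and nothing to do with the S3 repo itself — that ‘proves’ a claim from the JDK docs: `String.split` discards trailing empty strings by default, but keeps them when given a negative limit.

```java
public class SplitLearningTest {

    public static void main(String[] args) {
        // Claim from the String.split javadoc: with no limit argument,
        // trailing empty strings are removed from the result.
        String[] defaultSplit = "a,b,,".split(",");
        if (defaultSplit.length != 2) {
            throw new AssertionError("expected trailing empties to be dropped");
        }

        // Claim: a negative limit keeps trailing empty strings.
        String[] keepAll = "a,b,,".split(",", -1);
        if (keepAll.length != 4) {
            throw new AssertionError("expected trailing empties to be kept");
        }

        System.out.println("assumptions held");
    }
}
```

If either expectation were wrong, the test would fail loudly and tell us exactly which assumption to go back and correct — that’s the whole point of a learning test.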

Let’s take our S3 example – we read through the documentation and see that, according to the docs, our credentials need to match the bucket’s region when we add to and delete from a bucket.

We simply express our expectations as unit tests, and see them pass and fail. We’re using them as an exploratory harness for our assumptions.

    @Test
    public void canAddFindAndRemoveObjectFromBucketWhenRegionMatches() throws Exception {

        String testKey = "key_name";

        // Client and bucket are both in us-east-1, so these calls should succeed
        AmazonS3 s3 = usEast1RegionBuilder();

        // tempFile() creates a temporary file containing "A stream of text"
        s3.putObject(US_EAST_REGION_BUCKET, testKey, tempFile());

        S3Object retrieved = s3.getObject(new GetObjectRequest(US_EAST_REGION_BUCKET, testKey));

        assertThat(bucketContents(retrieved.getObjectContent()), is("A stream of text"));

        s3.deleteObject(US_EAST_REGION_BUCKET, testKey);

        // After deletion, fetching the same key should fail with a clear error
        try {
            s3.getObject(new GetObjectRequest(US_EAST_REGION_BUCKET, testKey));
            fail("Should not have retrieved");
        } catch (AmazonS3Exception awsS3) {
            assertThat(awsS3.getMessage().contains("The specified key does not exist"), is(true));
        }
    }

This was vital. By running through a series of tests and seeing them fail, I was able to see that my existing understanding of the APIs was wrong in some cases. When a test failed, I could find the explanation in the official docs or in someone’s article, and I was left with a test I could re-run at will to see the impact.

You can get access to my S3 repo here for some ideas. Note how the main folder is the SDK code that AWS provide, and my test folders build on that.

In fact, since I originally started playing around with these concepts, AWS have set up their own repo for V2 SDK code, and you can see their version of the S3 tests for V2 and what they choose to test. Compare ideas between their tests and mine, and come up with your own tests with a mix of approaches.

Regardless of whether you look at my repo, theirs, or both, the important thing is that the learning has to come from you. You need to ‘build your own toolset’ by doing the following:

  • Look through the SDK for S3 docs, and find relevant points about how the SDK works. I’ve used Java here, but you can swap in a language you’re more familiar with.
  • Look particularly into error flows, signing requests, regions and permissions.
  • Anything you’re not sure on, write a unit test, assert on an expectation and see what happens.
  • If you’re wrong in your assumption, find out why, fix the test and document the reason.
  • Set up your own repo so you can refer back to your code in the coming weeks.
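To make the ‘wrong assumption, fix the test, document the reason’ loop concrete without needing AWS credentials, here’s a self-contained Java sketch. The class name and the region strings are purely illustrative, not from the article’s repo. The original (wrong) assumption was that `Arrays.asList` returns an ordinary growable list; the corrected test captures why it doesn’t.

```java
import java.util.Arrays;
import java.util.List;

public class AsListLearningTest {

    public static void main(String[] args) {
        List<String> regions = Arrays.asList("us-east-1", "eu-west-1");

        // Original assumption: regions.add(...) just works.
        // That assumption was wrong: Arrays.asList returns a fixed-size
        // view backed by the original array, so add() throws.
        boolean addRejected = false;
        try {
            regions.add("ap-southeast-2");
        } catch (UnsupportedOperationException expected) {
            addRejected = true;
        }
        if (!addRejected) {
            throw new AssertionError("expected add() to be rejected");
        }

        // Documented reason: set() is fine (the size doesn't change),
        // but anything that grows or shrinks the list is not.
        regions.set(0, "us-west-2");
        if (!"us-west-2".equals(regions.get(0))) {
            throw new AssertionError("expected set() to succeed");
        }

        System.out.println("fixed test passes; reason documented");
    }
}
```

The failed first run is the valuable part: the corrected test, plus the comment explaining why, becomes a permanent record in your repo that you can refer back to.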


We’ve covered a lot here, and you won’t need to learn new techniques every week to keep up. Going forward, we’ll be looking more at how to distill the essence of the AWS philosophy from each of the distinct technologies.

But having the techniques to make your learning more active and less passive is where we need to start.

I hope this was helpful to you. If you have any feedback, questions, queries about this (or any) article, please feel free to contact me.

Next week we’ll be looking at AWS Security.