Continuous Deployment of AWS Lambda with Java Runtime

Anton Bakalets
Published in Nerd For Tech · Jan 6, 2021


This is rather a note to myself on how to set up CI/CD for a Lambda function written in Java.

Nowadays, implementing ideas and concepts that just need to be proved can be done using a serverless approach. Such an approach has a lot of benefits: no infrastructure management and a pay-as-you-go payment model, to name a few. But before you start coding your business logic, you need to make sure code changes are built and delivered as quickly as possible.

AWS CodePipeline is a service that orchestrates the steps required to continuously deliver your serverless application. Basically, those steps include pulling code from CodeCommit, compiling and packaging the code with CodeBuild, and finally deploying packages with CodeDeploy. But in the case of AWS Lambda the deployment process is a little different and also involves Amazon S3, CloudFormation and the AWS Serverless Application Model.
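
For orientation, the sketch below shows roughly how those stages could be expressed if you later decide to manage the pipeline itself as code; in this article the pipeline is created through the console, and the repository, project, role and bucket names here are placeholders, not values from the tutorial.

# Illustrative outline of the pipeline stages (placeholder names throughout)
Resources:
  LambdaPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineServiceRole.Arn      # hypothetical pipeline service role
      ArtifactStore:
        Type: S3
        Location: bucketname                        # bucket for pipeline artifacts
      Stages:
        - Name: Source                              # pull the code from CodeCommit
          Actions:
            - Name: Source
              ActionTypeId: { Category: Source, Owner: AWS, Provider: CodeCommit, Version: "1" }
              Configuration: { RepositoryName: my-lambda-repo, BranchName: main }
              OutputArtifacts: [ { Name: SourceArtifact } ]
        - Name: Build                               # run buildspec.yml with CodeBuild
          Actions:
            - Name: Build
              ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: "1" }
              Configuration: { ProjectName: my-lambda-build }
              InputArtifacts: [ { Name: SourceArtifact } ]
              OutputArtifacts: [ { Name: BuildArtifact } ]
        # The Deploy stage (an AWS CloudFormation action) is added in Step 3 below.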

Let's say our goal is to perform automatic deployment of a Lambda function with a Java codebase hosted in CodeCommit on each commit. AWS provides a few CodePipeline tutorials. The one describing how to publish a serverless application to the AWS Serverless Application Repository is the closest to our goal, but it needs some minor updates and comments to make it work for Lambda and Java code.

Before you begin

You must already have the following, or create it as you follow this tutorial:

  • A CodeCommit repository with some sample Lambda code written in Java; you can use the examples provided by AWS or the code examples in the GitHub repository created for this article.
  • An S3 bucket to store code deployment packages.
  • Familiarity with AWS CloudFormation and the AWS Serverless Application Model (AWS SAM).

Step 1: Create a buildspec.yml file

Create a buildspec.yml file with the following contents and add it to the root folder of your CodeCommit repository. The sam-template.yml is your application's AWS SAM template and bucketname is the S3 bucket where your packaged application will be stored.

version: 0.2

phases:
  install:
    runtime-versions:
      java: corretto11
  build:
    commands:
      - mvn package
      - sam package
          --template-file sam-template.yml
          --s3-bucket bucketname
          --output-template-file packaged-template.yml
artifacts:
  files:
    - packaged-template.yml

The most important thing here is to understand the difference between the sam-template.yml and packaged-template.yml files. The sam package command creates a .zip file of your code and uploads it to S3. It then returns a copy of your AWS SAM template (packaged-template.yml), replacing references to local artifacts with their Amazon S3 locations.
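
For example, a minimal sam-template.yml for a Java function might look like the sketch below; the handler class, artifact name and memory settings are placeholders, not code from this article's repository.

# sam-template.yml (before packaging), minimal sketch with placeholder names
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: com.example.HelloHandler::handleRequest   # placeholder handler class
      Runtime: java11
      MemorySize: 512
      Timeout: 30
      CodeUri: target/hello-lambda-1.0.jar                # local shaded jar built by mvn package

After sam package runs, packaged-template.yml contains the same resource, but its CodeUri points to an s3://bucketname/... location instead of the local jar.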

A few other items to note here:

  • You might have noticed that Maven is used to build the shaded jar containing all required dependencies, but Gradle can be used as well.
  • The runtime-versions section specifies version 11 of Java. Amazon recommends corretto11 for the Amazon Linux 2 standard image and openjdk11 for the Ubuntu standard image.
  • Please note that buildspec.yml version 0.1 doesn't support the runtime-versions section.

Step 2: Create and configure your pipeline

This step follows Step 2: Create and configure your pipeline from the tutorial exactly. Just make the appropriate changes if you are using CodeCommit instead of GitHub and the Amazon Linux 2 operating system instead of Ubuntu.

Just do not forget to add a new policy statement that allows CodeBuild to put objects into the S3 bucket where your packaged application is stored.
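
As a sketch, that statement could look like the fragment below, written in YAML as it would appear in a CloudFormation-managed policy; in the IAM console you would add the equivalent JSON. The bucket name is a placeholder.

# Additional statement for the CodeBuild service role (sketch, placeholder bucket name)
- Effect: Allow
  Action:
    - s3:PutObject
  Resource: arn:aws:s3:::bucketname/*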

Step 3: Create deploy action

Follow these steps to set up the pipeline action responsible for creating the CloudFormation stack that contains the Lambda function.

  1. Open the CodePipeline console.
  2. In the left navigation section, choose the pipeline that you want to edit.
  3. Choose Edit.
  4. After the last stage of your current pipeline, choose + Add stage. In Stage name enter a name, such as Deploy, and choose Add stage.
  5. In the new stage, choose + Add action group.
  6. Enter an action name. From Action provider, in Invoke, choose AWS CloudFormation.
  7. From Input artifacts, choose BuildArtifact.
  8. From Action Mode, choose Create or update a stack.
  9. Choose your Stack name.
  10. In the Template section, from Artifact name, choose BuildArtifact, and set File name to packaged-template.yml, which is the result of the SAM transformation mentioned above.
  11. You need to add two Capabilities: CAPABILITY_IAM, to acknowledge that you want to allow CloudFormation to modify IAM resources, and CAPABILITY_AUTO_EXPAND, because the template contains the AWS::Serverless transform, which is a macro hosted by AWS CloudFormation.
  12. You need to create a role that allows CloudFormation to perform the required actions: download the artifact from S3, create a change set, create the Lambda function, and create a role for the Lambda function defined in the template file. The policy for this role looks like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "lambda:CreateFunction",
                "lambda:GetFunction",
                "lambda:DeleteFunction",
                "lambda:UpdateFunctionCode",
                "lambda:UpdateFunctionConfiguration",
                "lambda:ListTags",
                "lambda:TagResource",
                "lambda:UntagResource",
                "cloudformation:CreateChangeSet",
                "iam:GetRole",
                "iam:CreateRole",
                "iam:DeleteRole",
                "iam:PutRolePolicy",
                "iam:AttachRolePolicy",
                "iam:DeleteRolePolicy",
                "iam:DetachRolePolicy",
                "iam:PassRole",
                "s3:GetObject"
            ],
            "Resource": "*",
            "Effect": "Allow"
        }
    ]
}

  13. Choose Done for the action.
  14. Choose Done for the stage.
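
For reference, the deploy action configured through the console in this step corresponds roughly to the pipeline fragment below; the stack name and role name are placeholders.

# Deploy stage fragment, roughly equivalent to the console configuration above
- Name: Deploy
  Actions:
    - Name: Deploy
      ActionTypeId: { Category: Deploy, Owner: AWS, Provider: CloudFormation, Version: "1" }
      InputArtifacts: [ { Name: BuildArtifact } ]
      Configuration:
        ActionMode: CREATE_UPDATE                      # "Create or update a stack"
        StackName: my-lambda-stack                     # placeholder stack name
        TemplatePath: BuildArtifact::packaged-template.yml
        Capabilities: CAPABILITY_IAM,CAPABILITY_AUTO_EXPAND
        RoleArn: !GetAtt CloudFormationDeployRole.Arn  # role with the policy shown above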

To verify your pipeline, make a commit and push the changes to your code repository. When the pipeline completes, you can check the resources created by the CloudFormation stack and find your Lambda function.

Billing

Now that your pipeline is set up and your Lambda function is deployed, you might wonder what the cost of running this infrastructure is. According to the Billing service, the following services incur charges:

  • CodeCommit charges per active user.
  • CodeBuild charges per build minute.
  • S3 charges per request, depending on the request type, and per amount of storage.
  • CloudWatch charges per amount of ingested log data.
  • Lambda charges per invocation.

Therefore, you are charged using a pay-as-you-go model. For each resource in this list you pay only when you actually use it: when you trigger a build or when the Lambda function is invoked. While doing quick prototyping, you can easily keep all expenses within the Free Usage Tier limits.

Clean up

The amount of S3 storage used can be reduced by deleting old build packages from the S3 bucket. The simplest way to do that is to declare lifecycle management rules on the bucket, without any additional service invocations or IAM configuration.
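
For instance, if the bucket is itself managed in a CloudFormation template, an expiration rule for old deployment packages might look roughly like this; the bucket name and retention period are placeholders, and the same rule can also be configured in the S3 console.

# Sketch: automatically expire old deployment packages (placeholder values)
Resources:
  DeploymentBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: bucketname
      LifecycleConfiguration:
        Rules:
          - Id: ExpireOldPackages
            Status: Enabled
            ExpirationInDays: 30      # delete packages older than 30 days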

Even if the cost is really low, you may want to delete all created resources to avoid any extra charges. Just delete the CloudFormation stack, and it will remove all the resources it created, such as the Lambda function and the IAM role associated with it.

Now that your stack template is part of the source code, you can add new resources following the Infrastructure as Code principle; each change, whether to the business logic or to the stack template, will be automatically deployed by the pipeline.
