Deploying a WAR file on AWS Elastic Beanstalk

Deployment

Deploying a WAR file on the Amazon Cloud platform can easily be done using the Elastic Beanstalk service.

Elastic Beanstalk is a service that allows you to easily deploy and manage applications in the AWS Cloud without having to take care of the infrastructure where the code is deployed. The idea is to simply upload the packaged Web Application file (or indicate its location using an S3 bucket) and Amazon will take care of the rest.

Elastic Beanstalk supports applications developed in Go, Java, .NET, Node.js, PHP, Python, and Ruby. When you deploy your application, Elastic Beanstalk builds the selected supported platform version and provisions one or more AWS resources, such as Amazon EC2 instances, to run your application.

Elastic Beanstalk uses a model based on versions and environments so that you can easily switch the version deployed to an environment or add/remove an environment for your application:

Figure 1 https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/Welcome.html

A configuration is composed of many AWS entities and indicates:

  • Which EC2 instance will host the Web Application (it can be selected from the large choice of EC2 types offered by AWS, e.g. t2.micro, t2.large, etc.);
  • The software used;
  • The auto-scaling policy;
  • The rolling (deployment) policy:
    • All instances are updated at once
    • Instances are updated in batches (and so is the rollback)
    • New instances are booted up in parallel and the switch is done if the health checks are okay
  • Etc.

An application can easily be created using the AWS console.
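The same can also be sketched from the command line with the EB CLI; the application name, platform, and region below are placeholder assumptions, not values from this project:

```shell
# Initialize an Elastic Beanstalk application in the current project folder
# (application name, platform, and region are example values)
eb init my-webapp --platform tomcat --region eu-west-1

# Create a new environment that will run the application
eb create my-webapp-env

# Check the environment status and health
eb status
```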

The platform is selected at the environment level, where Amazon supports a lot of technologies:

In my example, it’s a Spring Boot + Angular (generated using JHipster) project which is being deployed as a WAR file on a Tomcat instance.

Source code can be uploaded directly from the AWS console:

Once it is uploaded, the environment will boot up, which means Amazon AWS will set up all the required artifacts so that the Web Application can be up and running:

  • EC2 instance;
  • IAM roles;
  • Load balancer if required;
  • An RDS database can be included within the Elastic Beanstalk environment, but it will be terminated if the environment is deleted, so this is not a best practice for a production Web Application.

Elastic Beanstalk itself is free of charge; only the resources it creates are billed, which is an interesting point.

Once the environment is booted up, AWS will provide a basic health check and the public endpoint from which the application can be accessed. Events can also be tracked from the same page:

Figure 2 https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/environments-console.html

Once the environment is up and running, a new version of the code can be uploaded directly from the console. AWS will store the application package in an S3 bucket and deploy the new version to the environment (following the deployment policy specified in the environment configuration).
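Under the hood this corresponds to two AWS CLI calls; a sketch, assuming hypothetical application, bucket, and environment names:

```shell
# Register the uploaded package as a new application version
aws elasticbeanstalk create-application-version \
    --application-name my-webapp \
    --version-label v2 \
    --source-bundle S3Bucket=my-deploy-bucket,S3Key=ROOT.war

# Point the environment at the new version; the deployment policy
# configured on the environment (all at once, rolling, ...) is applied
aws elasticbeanstalk update-environment \
    --environment-name my-webapp-env \
    --version-label v2
```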

Managing version releases from an AWS Pipeline

Manually deploying the Web Application from the AWS console is not really the preferred option in companies. The idea is to wrap the build, any test cases and code checks, and the deployment into a CI/CD pipeline, and this can be achieved using AWS solutions.

Building the code

Since the example component had another Spring Boot project as a dependency, I needed to use an AWS CodeArtifact repository, which is basically a shared artifact repository managed in the cloud:

CodeArtifact also provides the developer with the required information on how to use the repository from the application code; for Maven, this can easily be done by following the steps proposed by Amazon:

The goal is to make sure all the dependencies will be fetched from and stored in the CodeArtifact repository.

Accessing the repository is done via an authorization token, so keep in mind that the IAM policy requires the get-authorization-token permission:
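Fetching such a token can be sketched with the AWS CLI; the domain name and owner account ID below are placeholders:

```shell
# Request a temporary authorization token for the CodeArtifact domain
# and expose it to Maven via an environment variable
export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token \
    --domain my-domain \
    --domain-owner 123456789012 \
    --query authorizationToken \
    --output text)
```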

Once the CodeArtifact repository is set up, the CodeBuild project can be created.

I will skip most of the details of the CodeBuild project creation, but basically:

  • Source code comes from GitHub. Amazon can also work with AWS CodeCommit, S3, BitBucket, or GitHub Enterprise;
  • The AMI used was Amazon Linux 2;
  • Artifacts are pushed to S3.

The important piece is the buildspec file used by Amazon to customize the build commands. The buildspec can either be placed at the root level of the project or specified directly in the AWS Console:

Some explanations:

  • The pre_build command is used to generate an authorization token that will be used to interact with the repository;

The settings.xml file is part of the project so that developers can include specific Maven configuration; in our case, the AWS CodeArtifact info:

  • And it is copied to the cache;
  • The build commands are the actual commands used in the Maven build;
  • An important detail is the renaming step of the WAR file: I got errors when I was not working with a file named “ROOT.war”. It should also match the information specified in the .ebextensions configuration file (see below);
  • The .m2 repository is cached so that the next build will not download all the artifacts again;
  • Output artifacts are the WAR file and also the .ebextensions file(s) that have to be sent to the Elastic Beanstalk EC2 instance. In short: “You can add AWS Elastic Beanstalk configuration files (.ebextensions) to your web application’s source code to configure your environment and customize the AWS resources that it contains. Configuration files are YAML- or JSON-formatted documents with a .config file extension that you place in a folder named .ebextensions and deploy in your application source bundle.” (quoted from: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/ebextensions.html).
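Putting the points above together, the buildspec might look roughly like the following sketch; the CodeArtifact domain, account ID, and file paths are placeholder assumptions, not the actual values used in this project:

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # Generate the CodeArtifact authorization token consumed by settings.xml
      - export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token --domain my-domain --domain-owner 123456789012 --query authorizationToken --output text)
  build:
    commands:
      # Regular Maven build, using the settings.xml shipped with the project
      - mvn -s settings.xml clean package
  post_build:
    commands:
      # Elastic Beanstalk expects the package to be named ROOT.war
      - cp target/*.war ROOT.war

cache:
  paths:
    # Cache the local Maven repository between builds
    - '/root/.m2/**/*'

artifacts:
  files:
    # The WAR file plus the Elastic Beanstalk configuration files
    - ROOT.war
    - .ebextensions/**/*
```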

In my example, I had a fix-path.config file in the .ebextensions folder at the root level of the project, which was used to extract the files within the ROOT.war file into the Tomcat application folder on the EC2 instance.
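Such a configuration file might look something like the following sketch; the command and the Tomcat path are assumptions and depend on the platform version in use:

```yaml
# .ebextensions/fix-path.config (hypothetical content)
container_commands:
  01_extract_war:
    # Unpack ROOT.war into Tomcat's webapps folder on the EC2 instance;
    # the Tomcat directory depends on the platform version
    command: unzip -o ROOT.war -d /var/lib/tomcat8/webapps/ROOT/
```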

Creating the final Pipeline

Once all the pieces are created, we still need to assemble them.

CodePipeline was used for this purpose and was composed of 3 steps:

  1. Source: fetching the source code from GitHub
  2. Build: using the CodeBuild configuration created previously, running the Maven build and collecting the artifacts generated from it
  3. Deploy: deploying the output artifacts from step 2 to the Elastic Beanstalk application created at the beginning of the document

Important notes:

  • CodePipeline can automatically create GitHub webhooks so that a pipeline execution is triggered every time a commit is pushed.

Artifacts from step 2 are pushed to Elastic Beanstalk as specified in the configuration of the step:

So basically, ROOT.war file and .ebextensions/*.config files are copied to the EC2 instance.

Author: Jason David

Samuel Vandecasteele


