Project Report - Team 3
Name Student ID
Julia Im 102719887
Quoc Trung Pham 104480583
Ahnaf Tajwar Afif 104359094
Consultation Phase
Project description
The aim of this project is to develop a fully automated DevOps pipeline that uses AWS
services and supports continuous integration (CI) and continuous delivery (CD). The pipeline
streamlines coding, testing and deployment, enabling efficient software updates while
minimizing the need for manual intervention.
When designing a solution for the DevOps pipeline, it is fundamental to compare a variety of
candidate tools and select those best suited to a seamless and effective workflow. The tools
considered for development are Amazon Web Services (AWS) offerings, including CodeCommit,
CodeBuild, CodePipeline, CloudWatch and Elastic Beanstalk, which together provide a seamless
workflow for code management and deployment.
Of the selected tools, AWS CodeCommit is a fully managed source control service that enables
users to manage their code repositories in a secure and scalable environment. Because it
supports Git, software developers can collaborate on code, and CodeCommit works alongside
other AWS services to enable automated workflows.
Next is AWS CodeBuild, which provides the continuous integration service and automates
packaging, building and testing of the code. The code is compiled, verified through unit tests,
and packaged into artifacts ready for deployment. AWS CodeBuild also offers scalability: there
is no need to manage build servers, as it scales automatically with the number of builds running
simultaneously.
AWS CodePipeline serves as the continuous delivery part of the DevOps pipeline, automating
the build, test and release process for software delivery. It works alongside CodeCommit and
CodeBuild to drive deployment, ensuring a stable release process and helping to update
software rapidly without compromising reliability. It can also integrate with third-party tools
such as GitHub and Jenkins.
For performance tracking and metrics, AWS CloudWatch is a monitoring service that provides
real-time observations and insights into the health and performance of the infrastructure
and its applications. CloudWatch logs metrics and events, which reduces management overhead
by sending notifications, visualizing trends, and surfacing possible anomalies.
With the AWS tools selected and able to coordinate with each other, a further tool to build
on this DevOps pipeline foundation is Jenkins, an open-source automation server best known
for providing continuous integration (CI) and continuous delivery (CD) in DevOps pipelines.
Jenkins is highly customizable and integrates easily with AWS. In this pipeline, Jenkins
triggers builds whenever code changes are pushed to their respective repositories; automated
tests are then run to confirm that the code remains functional and stable.
Altogether, these tools define the DevOps pipeline and create the foundation for a seamless
CI/CD workflow. They form an efficient, integrated approach that reduces manual adjustments
and input while minimizing errors. The design of the pipeline emphasizes automation,
efficiency and reliability: by automating testing, building and deployment, the tools enhance
software quality and reduce the risk of human error. The result is a pipeline that supports
rapid development without compromising the stability and overall efficiency of the
deployment lifecycle.
Performance tracking is also used to determine whether the efficiency, speed and accuracy of
each build meet standards; where they do not, the results can be used to further optimize
deployment.
Statement
In modern software development, one of the most common challenges for developers is managing
frequent code changes while maintaining the stability and performance of the system.
Traditional manual fixes and processes slow the workflow, are time-consuming, and are prone
to human error, which can lead to disruptions and inconsistencies in deployment. This DevOps
pipeline addresses these issues using AWS services.
To meet the evolving demands of modern technology, teams must adapt to testing and deploying
code changes quickly and reliably. This introduces the challenge of managing multiple
environments (development, testing and production), which, without proper automation, can
lead to complex issues.
The end goal of the DevOps pipeline is an efficiently running CI/CD pipeline that proactively
detects and resolves issues, delivers rapid, error-free updates and deployments, and
simplifies the management of the various environments. By implementing automation, the
project aims to enhance software quality, system reliability and deployment frequency.
2 - Jenkins serves as an automation server that helps set up a CI/CD pipeline, handling the
software building, testing and deployment stages through automation.
3 - Using AWS S3 to host static websites is a common practice among developers who want a
simple way to serve ReactJS applications online. S3 offers static website hosting, which is
well suited to front-end apps that render content without needing a server.
4 - Jenkins jobs are automated by setting up GitHub webhooks that trigger them whenever there
are changes in the repository, with no manual intervention needed. This follows CI/CD
principles, testing and deploying after every code modification.
Design and Development
In our original design, we planned to use a Jenkins CI/CD pipeline to build and deploy a
ReactJS application from a GitHub repository to an AWS S3 bucket, with Jenkins running on an
EC2 instance, thereby achieving continuous integration and delivery.
To get started, we would need usable accounts on both GitHub and AWS. We would also have to
make sure the ReactJS project is stored in a GitHub repository, then set up an S3 bucket on
Amazon Web Services (AWS) with the permissions required for access and static website
hosting.
Next, we would set up a Jenkins server by installing Jenkins (alongside Node.js) on an
instance or server. For the AWS configuration, we would need to grant permissions in the
security group of the EC2 instance and on the S3 bucket; an Identity and Access Management
(IAM) user is also essential for the following steps. We would also need to generate an
access key and a secret key for this user and make sure they are stored safely.
For the S3 bucket, we would make sure that public access is enabled in the configuration,
static website hosting is turned on, and index.html is specified as the index document.
Finally, we would set up a bucket policy that permits read access to objects within the
bucket.
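As a sketch, a bucket policy of the following shape grants public read access to the objects (the bucket name is a placeholder, not the one used in the project):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-react-bucket/*"
    }
  ]
}
```

The `/*` suffix on the resource ARN is what scopes the permission to the objects inside the bucket rather than the bucket itself.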
To set up Jenkins, we would configure webhooks in GitHub pointing at the Jenkins server's IP
address, so that any modification in GitHub activates the Jenkins pipeline and the web
application is updated accordingly. Next, we would create a Jenkins pipeline job, connect it
to the GitHub repository, and configure the pipeline script using the Jenkinsfile found in
the repository.
The pipeline script would configure the pipeline to fetch the source from the GitHub
repository and use the AWS credentials stored in Jenkins to authenticate against the S3
bucket. The pipeline stages would then be: run 'npm install', build the ReactJS application,
and transfer the resulting files to the S3 bucket.
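A minimal sketch of those stages as shell steps (the bucket name is a placeholder; `build/` is the default output directory of a Create React App build, which is an assumption about the project layout):

```shell
# Stage 1: install dependencies
npm install

# Stage 2: produce the production build (output lands in build/)
npm run build

# Stage 3: upload the build artifacts to the S3 bucket
aws s3 sync build/ s3://example-react-bucket --delete
```

The `--delete` flag removes objects from the bucket that no longer exist in the local build, keeping the hosted site in step with the latest build.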
For our application to run smoothly, we would need to incorporate some tests, activating the
pipeline and double-checking that the ReactJS application has been deployed without issues.
We would then change content in the GitHub repository to test the deployment, ensuring that
the pipeline identifies the modifications and triggers a redeployment that reflects the
changes.
Finally, we would access the deployed ReactJS application using the website URL of the S3
bucket.
This configuration would allow deployment of any changes made to the GitHub repository onto
the S3-hosted application, demonstrating a successful application of continuous integration
and delivery.
Compatibility of Design
For integration and design, we used AWS services with GitHub, Jenkins and Docker for highly
effective deployments. We wrote the shell script and set up an AWS EC2 instance to host the
Jenkins server, which is integrated with GitHub version control for automatic builds. With
Jenkins automating the workflow and Docker in play, the application is designed from day one
for consistent deployment across all kinds of environments, reducing issues around running or
scaling it. Because Docker provides containerization, replication is easy to achieve
regardless of the operating system, and performance should be consistent.
In addition, the necessary AWS and Jenkins security measures must be compatible with
everything. SSH-key-secured access to the GitHub repository ensures that only authorized
users can push updates. The Jenkins-GitHub connection is kept in sync through GitHub
webhooks, so that whenever there is a change to the repository, Jenkins knows about it
automatically. The design emphasizes automation, security and compatibility across cloud
platforms, CI/CD processes and application layers, making it fully production-grade.
Specifications
The setup process includes configuring an Amazon EC2 instance to host the Jenkins server.
Here's a summary:
Amazon EC2 Configuration: The instance is created on AWS with a default security group whose
inbound rules open port 8080 (for Jenkins) and port 8000 (for the application) to external
access. These settings are essential for automating Jenkins tasks and deploying Docker.
Installing Jenkins: Jenkins is Java-based, so we install the JDK (Java Development Kit) on
the instance and verify that it installed successfully. Jenkins is then installed and
configured to run with the system using systemctl commands.
SSH Key Generation and Integration: We generated an SSH key on the instance and uploaded the
public key to GitHub so that Jenkins can connect securely to the repository via that key.
Docker Setup: Docker is installed on the EC2 instance, and a Dockerfile holds the commands
needed to build and run the Docker container.
GitHub Webhook: GitHub webhooks configured to point at Jenkins trigger automatic builds and
updates whenever a commit is pushed to the repository, making deployment straightforward.
These specifications provide the basic initial setup needed to keep the CI/CD pipeline
automated from code creation through to deployment.
Data display
The CI/CD setup for this project displays data and logs mainly through the Jenkins dashboard
and AWS CloudWatch. During a build, the Jenkins console output shows everything happening in
real time, including errors and successes. Build history, failures and timings are also
visible in the dashboard interface, making it possible to track deployment status and any
issues that appear.
Our Docker container also writes logs while the application runs, recording useful log lines
about app performance and system errors at runtime. We have set up AWS CloudWatch to monitor
the EC2 instance's resource usage (CPU, memory and network activity), giving us more
information about the health of the application and insight into performance issues.
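As an illustration, CPU utilization for the instance can also be pulled from CloudWatch with the AWS CLI (the instance ID and time window below are placeholders; the project used the CloudWatch console):

```shell
aws cloudwatch get-metric-statistics \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --start-time 2024-01-01T00:00:00Z \
  --end-time 2024-01-01T01:00:00Z \
  --period 300 \
  --statistics Average
```

This returns the average CPU utilization in five-minute buckets over the requested window, the same data the CloudWatch graphs are drawn from.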
Jenkins Console Logs: For successful builds, these logs show how each stage in the pipeline
progressed and completed.
AWS CloudWatch Metrics: A snapshot of the CloudWatch metrics shows CPU, memory and network
traffic data; these help estimate resource needs by tracking the load on the application and
aid in detecting scaling issues.
Figure: Jenkins Console Log
1. Dockerfile: It is used to containerize our application, together with the Jenkins
configuration and a set of shell commands that help automate the build and deployment
processes.
The Dockerfile itself is a building block for this project; with it we can create the
application container according to its instructions. The content we would expect to see in it
starts from a Node base image, sets the working directory to /app/, and copies all our
application files over. It then installs dependencies and runs tests if any exist.
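A minimal sketch of such a Dockerfile follows. The Node version, port, and start command are assumptions for illustration, not taken from the repository; `CI=true` is set so that a Create React App test script runs once instead of entering watch mode:

```dockerfile
# Base image: official Node.js image (version is an assumption)
FROM node:18

# Working directory inside the container
WORKDIR /app

# Copy the application files and install dependencies
COPY . .
RUN npm install

# Run the test script only if one exists in package.json
RUN CI=true npm test --if-present

# The app is assumed to listen on port 8000
EXPOSE 8000
CMD ["npm", "start"]
```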
2. GitHub Webhook Setup: The GitHub webhook setup is intended to trigger Jenkins
operations with each code push. The GitHub Integration plugin within Jenkins allows us
to create a webhook that connects Jenkins to the repository. This link enables the
automated start of a build process when code changes are submitted.
3. Shell Commands in Jenkins: We use shell commands in the Jenkins job configuration, going
to Execute Shell to write the following commands, which run automatically:
Figure: Jenkins Execute Shell Commands in Configuration Tab
The script first checks whether an existing container is running; if it finds one, it stops
and removes that container. It then builds a new container from the changed code and runs it.
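A sketch of an Execute Shell script of that shape (the container and image names are placeholders, not the exact script used in the job; sudo is used because, as noted later, Docker gives a permission error without it):

```shell
#!/bin/sh
# Placeholder names for the container and image
CONTAINER=react-app
IMAGE=react-app:latest

# Stop and remove the old container if one exists
if [ -n "$(sudo docker ps -aq -f name=$CONTAINER)" ]; then
  sudo docker stop "$CONTAINER" || true
  sudo docker rm "$CONTAINER"
fi

# Build a fresh image from the checked-out code and run it on port 8000
sudo docker build -t "$IMAGE" .
sudo docker run -d --name "$CONTAINER" -p 8000:8000 "$IMAGE"
```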
Figure: Launching EC2 Instance
Figure: Creating Key Pair
Figure: Changing some necessary Network Settings
Figure: Instance Running
Figure: Connection to the Instance via Amazon
Now, to install Jenkins, we will go through the following procedures and commands.
Now we will install the Java Development Kit, because Jenkins runs on Java, and verify the Java installation.
Now, we will import the Jenkins GPG key and add the Jenkins Debian repository to the system's
sources list. Then we will update the package list again to include the Jenkins packages, and
install Jenkins.
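The commands behind these steps look roughly as follows, sketched from the current Jenkins Debian installation instructions (the JDK version and the keyring file path are assumptions; the screenshots in the original report show the exact commands used):

```shell
# Update packages and install the JDK that Jenkins runs on
sudo apt update
sudo apt install -y openjdk-17-jdk
java -version

# Import the Jenkins GPG key and add the Jenkins Debian repository
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key \
  | sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc]" \
  "https://pkg.jenkins.io/debian-stable binary/" \
  | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null

# Refresh the package list so it includes Jenkins, then install it
sudo apt update
sudo apt install -y jenkins
```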
Now we will start the Jenkins service and enable it to start on boot, using sudo systemctl start jenkins and
sudo systemctl enable jenkins. We can then verify the status of Jenkins using sudo systemctl status jenkins.
But Jenkins runs on port 8080, so we need to open that port. We can do that by editing the inbound rules of the
security group of the EC2 production server. For now, it is configured so that only my IP can access it.
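The project added this rule through the AWS console; the CLI equivalent looks like the following (the security group ID and IP address are placeholders):

```shell
# Allow inbound TCP traffic on port 8080 from a single IP only
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 8080 \
  --cidr 203.0.113.10/32
```

The /32 CIDR restricts access to exactly one address, matching the "only my IP" configuration described above.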
Now, if we go to the public IP of the instance on port 8080, we can access Jenkins.
We will click on Install suggested plugins and let it do its thing.
And then fill out the next page with the necessary information.
This is the Jenkins dashboard.
We will add a new item.
Before going to the credentials, we will generate an SSH key.
Now we have our private and public keys. To connect GitHub to our server, we have to give it the public key. To
do that, we go to GitHub settings, open SSH and GPG keys, and add a new SSH key.
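The key pair can be generated on the instance with a command of this shape (the key type and comment are assumptions, not necessarily what the screenshots show):

```shell
# Generate an Ed25519 key pair; the public half goes to GitHub
ssh-keygen -t ed25519 -C "jenkins@ec2-production"

# Print the public key so it can be pasted into GitHub settings
cat ~/.ssh/id_ed25519.pub
```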
Now we will select the previously made credentials.
Now we will click on Build now.
Now we can check it.
We have already uploaded the Dockerfile with the necessary commands to GitHub.
It gives a permission error, so we will use sudo with it.
Now we have to give access to port 8000 by adding it to the inbound rules of the production server.
Now we will run the app in a Docker container so that it doesn't stop every time we close the session. We will
build and run the container; for that, we first install Docker.
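On an Ubuntu instance, an install of this shape works (this uses the distribution's docker.io package as a sketch; the official Docker repository is an alternative, and the project's exact commands are in the screenshots):

```shell
# Install Docker from the distribution repository and start the daemon
sudo apt update
sudo apt install -y docker.io
sudo systemctl start docker
sudo systemctl enable docker

# Confirm the installation
sudo docker --version
```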
Then we build the Docker image.
Now we will run it through Jenkins by configuring it inside Jenkins. First, we will kill the running Docker container.
We can see from the console output that the build is successful.
Now we will use a webhook.
First, we will install a plugin in Jenkins named GitHub Integration.
In the meantime, we will configure our webhook in GitHub, and we will open port 8080 to any IPv4 address, where it
was previously restricted to my IP.
Now we will select the following and save the new change in the configuration.
Now, if we change the code in GitHub, it will automatically trigger a new build in Jenkins and show the new app in
the browser.
We change it to the following and save the file. It automatically triggers a new build and successfully updates the
app in the browser.
We can see that the change has been made by GitHub push.
We can add a new heading, and it will apply the change.
And now this is the functionality of the app.
And if I click on the cross button, it is removed.
We can also monitor the Production Server using Amazon CloudWatch.
And this is the app link
http://3.81.109.27:8000/
This instance is running on a personal AWS account, so it is currently off to save resources. We demonstrated the
live app during the presentation; the link is included here for completeness of the project report.
Project Outcomes
Our AWS CI/CD pipeline project was built with Jenkins and focused on streamlining code
management and testing for deployment on an Amazon EC2 instance. Each step was executed with
precision to facilitate updates and ensure application stability.
The project started by uploading the code to a GitHub repository for version control and team
collaboration, and integrating it with Jenkins for automated deployment. Following the
repository setup, we launched an Amazon EC2 instance tailored to serve as the production
server environment, with default configurations in place to support the CI/CD pipeline
operations.
With the EC2 instance up and running smoothly, we installed Jenkins to handle the automation
duties in the CI/CD pipeline. We chose Jenkins for its adaptability and seamless integration,
especially for automated testing and deployment. Setting up Jenkins involved a series of
steps, such as configuring the Jenkins package repository and ensuring that Jenkins starts
automatically on system boot.
As Jenkins operates on port 8080, we adjusted the rules of the EC2 instance's security group
to allow traffic through this port. This enabled a connection to Jenkins, making it easy to
manage remotely through a web browser.
From the Jenkins dashboard, we set up the project by creating a new job and connecting it to
GitHub. Using SSH keys generated on the EC2 instance, with the public key added to the GitHub
account, we created a secure, passwordless connection between GitHub and Jenkins, ensuring
smooth code integration. Additionally, on the Jenkins settings page, we entered our GitHub
credentials so that the project automatically fetches code updates.
The next step was using Docker to guarantee that the application could run reliably in a
controlled environment. A Dockerfile containing the settings for building and running the
application was added to the GitHub repository. Once Docker was set up on the EC2 instance, a
container was launched from the Dockerfile specifications, with security settings in place to
allow access through port 8000. This setup lets the application run independently of its
hosting environment, enhancing its resilience and security.
When Docker was used to run the application through Jenkins, we automated deployment by
configuring build commands in the Jenkins shell, so the application is deployed and run
inside the Docker environment without manual intervention at each build cycle. This increases
reliability and decreases dependence on manual processes.
To automate the process end to end, a GitHub webhook was set up to notify Jenkins whenever
code updates were pushed. Once the GitHub Integration plugin was installed in Jenkins and the
webhook configured in GitHub, every push promptly triggered a build that redeployed the
updated application, so the production environment seamlessly mirrored the recent changes.
This automation significantly improved the deployment workflow and accelerated deployment
speed.
We monitored the project's performance through Amazon CloudWatch to get real-time insight
into how the EC2 instance and the application were doing; it proved a useful tool for
watching metrics and managing the system efficiently.
A major issue we encountered during configuration was the repeated failure to create a test
environment. However, as our deployment proved successful and effective, we decided to forgo
the testing environment and simply monitor each build using Jenkins and Docker, checking for
errors or issues.
Overall, the AWS CI/CD pipeline effectively showcased the potential of an automated
deployment workflow. Using GitHub, Jenkins, Docker and AWS as the core components of the
setup was key: it drastically increased deployment speed and reduced downtime, enhancing the
stability and speed of the production environment. By employing automation and
containerization, the project demonstrates an efficient deployment strategy.