
An Internship Report

Submitted in Partial Fulfillment for the Award of the Degree Of

BACHELOR OF TECHNOLOGY

in
ARTIFICIAL INTELLIGENCE AND DATA SCIENCE
Submitted by

PADALA SRIRAM (20B91A5443)

Under the esteemed guidance of

M. GALEIAH
M.Tech (PhD)

Assistant Professor

DEPARTMENT OF INFORMATION TECHNOLOGY


S.R.K.R ENGINEERING COLLEGE (AUTONOMOUS)
(Approved by AICTE, New Delhi, Affiliated to JNTU University, Kakinada)
CHINNA AMIRAM :: BHIMAVARAM-534202

April-2024
STUDENT DECLARATION

I, PADALA SRIRAM, Regd. No: 20B91A5443, hereby declare that the internship report entitled
CLOUD ENGINEERING & DEVOPS at DATAVALLEY PVT. LTD., submitted to the office of
the Department of Information Technology, SRKR Engineering College (A), is my original
work, accomplished under the supervision of M. GALEIAH, in partial fulfillment of the
requirements for the Bachelor of Technology (B.Tech.). This work is independent, and any
help taken from other people has been mentioned in the acknowledgment. Neither any part of
this report nor the report as a whole has been submitted to any other university or academic
institution.

Signature of Student
Abstract

This project automates the deployment of an Amazon S3 bucket and a static website using Terraform.
Terraform configurations define S3 bucket creation, static website hosting, access control, and
lifecycle policies, and executing the Terraform scripts yields consistent deployments that adhere
to security best practices. Static website hosting is enabled on the bucket, granting public
access via a unique S3 endpoint. Access control policies regulate bucket access based on roles or
IP addresses, while lifecycle policies automate object management, such as storage class
transitions and deletion. The project underscores the efficiency of infrastructure as code in AWS
resource provisioning: it streamlines deployment, ensures scalability and maintainability, and
abstracts infrastructure management into code, fostering agility and reducing time-to-market.
Overall, the project exemplifies modern cloud deployment practices for web applications.
Table of Contents
1. Introduction to AWS & DevOps
   1.1 AWS
      1.1.1 Advantages of AWS
      1.1.2 List of Services in AWS
   1.2 DevOps
      1.2.1 Advantages of DevOps
2. AWS & DevOps Tools Used
   2.1 Amazon S3 Storage
   2.2 Static Web Hosting
   2.3 Access Control List
   2.4 Terraform
3. Architecture
4. Implementation
5. OUTPUTS
1. Introduction to AWS & DevOps

1.1 AWS
Amazon Web Services (AWS) revolutionized the tech industry by introducing a
comprehensive suite of cloud computing services that empower businesses to innovate and
scale rapidly. As a leading cloud provider, AWS offers a vast array of services, including
computing power, storage solutions, networking capabilities, and database management
tools, among others. Its pay-as-you-go pricing model and flexible infrastructure options
allow organizations to adapt quickly to changing business needs while optimizing costs.
With a global network of data centers, AWS ensures high availability and reliability,
enabling businesses of all sizes to leverage the power of the cloud to drive innovation and
growth.

In today's digital landscape, AWS plays a pivotal role in transforming how businesses
operate and deliver value to their customers. From startups to enterprise-level corporations,
organizations across industries rely on AWS to accelerate their digital transformation
journey. With its robust security features, compliance certifications, and extensive partner
ecosystem, AWS provides a secure and trusted platform for deploying mission-critical
applications and services. As businesses continue to embrace cloud computing as a strategic
imperative, AWS remains at the forefront, empowering enterprises to build scalable,
resilient, and agile solutions that drive business success in the modern era.

1.1.1 Advantages of AWS

1.Scalability and Flexibility: AWS provides elastic computing resources that can scale up
or down based on demand. This allows businesses to quickly respond to changing
requirements and avoid over-provisioning or under-provisioning of resources.

2.Cost-effectiveness: With AWS's pay-as-you-go pricing model, businesses only pay for
the resources they use, without any upfront investment. This flexibility helps optimize costs
and align expenses with actual usage.

3.Global Infrastructure: AWS has a vast global network of data centers spread across
multiple regions and availability zones. This enables businesses to deploy applications
closer to end-users for low-latency access and ensures high availability and fault tolerance.

4.Security and Compliance: AWS offers a wide range of security features and compliance
certifications to protect data and meet regulatory requirements. This includes encryption,
identity and access management (IAM), and compliance with standards like GDPR and
HIPAA.

5.Reliability and Performance: AWS's infrastructure is designed for high reliability and
performance, with industry-leading service level agreements (SLAs) for uptime and
availability. This ensures that businesses can deliver consistent and responsive experiences
to their customers.

6.Innovation and Ecosystem: AWS continuously innovates and introduces new services
and features to meet evolving business needs. Its extensive ecosystem of partners and
third-party integrations further enhances the platform's capabilities and enables businesses
to build complex and innovative solutions.

1.1.2 List of Services in AWS

Amazon, the preeminent cloud vendor, broke new ground by launching one of the first cloud
computing services, Amazon EC2, in 2006. AWS offers more solutions and features than
any other provider and has free tiers with access to the AWS Console, where users can
centrally manage their services.
Designed around ease of use for various skill sets, AWS is also approachable for those
unaccustomed to software development utilities. Web applications can be deployed in
minutes with AWS services, without provisioning servers or writing additional code.

o Amazon EC2 (Elastic Compute Cloud)
o Amazon RDS (Relational Database Service)
o Amazon S3 (Simple Storage Service)
o AWS Lambda
o Amazon Cognito
o Amazon S3 Glacier
o Amazon SNS (Simple Notification Service)
o Amazon VPC (Virtual Private Cloud)
o Amazon Lightsail
o Amazon CloudWatch
o AWS Cloud9
o AWS Elastic Beanstalk
o AWS CodeCommit
o AWS IAM (Identity and Access Management)
o Amazon Inspector
o Amazon Kinesis
o Amazon DynamoDB
o Amazon CodeCatalyst
o Amazon Athena
o AWS Amplify
o Amazon QuickSight
o AWS CloudFormation

1.2 DEVOPS

DevOps is a transformative approach to software development and IT operations that


emphasizes collaboration, automation, and continuous delivery to streamline the software
delivery lifecycle. It aims to break down silos between development and operations teams,
fostering a culture of shared responsibility and accountability for delivering high-quality
software efficiently. By integrating development, testing, deployment, and monitoring
processes into a unified workflow, DevOps enables organizations to accelerate
time-to-market, improve product quality, and enhance overall business agility.

At its core, DevOps emphasizes automation to achieve faster and more reliable software
delivery. By leveraging automation tools for infrastructure provisioning, configuration
management, and deployment pipelines, DevOps teams can eliminate manual, error-prone
tasks and achieve consistent and repeatable processes. Additionally, DevOps encourages
the adoption of agile principles and practices, such as iterative development, continuous
integration, and feedback loops, to enable rapid iteration and innovation. Ultimately,
DevOps represents a cultural shift in how organizations approach software development
and operations, promoting collaboration, transparency, and a relentless focus on delivering
value to customers.

1.2.1 Advantages of DevOps

Using DevOps practices offers several advantages for organizations:

1.Accelerated Delivery: DevOps enables faster delivery of software by automating


processes such as build, test, and deployment. This results in shorter development cycles,
allowing organizations to respond quickly to changing market demands and gain a
competitive edge.

2.Improved Collaboration: DevOps fosters collaboration and communication between


development, operations, and other stakeholders. By breaking down silos and promoting
cross-functional teams, DevOps encourages shared ownership and accountability, leading
to smoother and more efficient workflows.

3.Increased Efficiency: Automation plays a key role in DevOps, reducing manual effort
and minimizing errors. By automating repetitive tasks like configuration management,
infrastructure provisioning, and testing, organizations can streamline processes, increase
productivity, and reduce time-to-market.

4.Enhanced Quality: DevOps practices emphasize continuous integration and continuous


delivery (CI/CD), enabling frequent and incremental releases. This iterative approach to
software development allows teams to detect and address issues early in the development
cycle, leading to higher-quality software and improved customer satisfaction.

5.Scalability and Resilience: With DevOps, organizations can build scalable and resilient
systems by leveraging infrastructure as code (IaC) and cloud technologies. By treating
infrastructure as software, DevOps teams can automate the provisioning and scaling of
resources, ensuring that applications can handle fluctuating workloads and maintain high
availability.

6.Improved Feedback Loops: DevOps encourages the use of monitoring and feedback
mechanisms to gather insights into application performance and user behavior. By
collecting and analyzing data from production environments, teams can make informed
decisions, prioritize improvements, and continuously optimize their systems.

7.Cultural Transformation: Beyond technical practices, DevOps promotes a cultural shift
towards collaboration, experimentation, and continuous learning. By fostering a culture of
innovation and accountability, organizations can create a more adaptive and resilient
workforce capable of driving sustainable growth and innovation.

2. AWS & DevOps Tools Used

2.1 Amazon S3 Storage


Amazon Simple Storage Service (S3) is a massively scalable storage service based on object storage
technology. It provides a very high level of durability, with high availability and high performance.
Data can be accessed from anywhere via the Internet, through the Amazon Console and the
powerful S3 API.

S3 storage provides the following key features:


Buckets—data is stored in buckets. Each bucket can store an unlimited amount of unstructured
data.

Elastic scalability—S3 has no storage limit. Individual objects can be up to 5TB in size.

Flexible data structure—each object is identified using a unique key, and you can use metadata to
flexibly organize data.

Downloading data—easily share data with anyone inside or outside your organization and enable
them to download data over the Internet.

Permissions—assign permissions at the bucket or object level to ensure only authorized users can
access data.

APIs—the S3 API, provided as both REST and SOAP interfaces, has become an industry standard
and is integrated with a large number of existing tools.

5 Use Cases for S3 Storage:

1. Backup and Archival:


One of the primary use cases for S3 storage is backup and archival. Organizations can leverage
S3’s durability and availability to ensure the safety and longevity of their data. S3’s redundant
architecture and distributed data storage make it possible to store critical data that needs to be
accessed quickly and securely.

S3 also offers seamless integration with various backup and archival software. This allows
businesses to automate the backup and archival processes, reducing the risk of human error and
ensuring data is consistently protected. With S3’s versioning capabilities, organizations can also
retain multiple versions of their files, enabling roll back to previous versions if needed.

2. Content Distribution and Hosting:
By leveraging S3’s global network of edge locations, content creators can distribute their files
seamlessly to end-users, reducing latency and improving user experience. S3’s integration with
content delivery networks (CDNs) further enhances its content distribution capabilities, ensuring
that files are delivered quickly and efficiently.

Moreover, S3 storage is highly scalable, allowing businesses to handle high traffic spikes without
performance degradation. This makes it an ideal choice for hosting static websites, where content
is served directly from S3 buckets. With S3’s support for custom domain names and SSL
certificates, businesses can create a reliable and secure web hosting environment.

3. Disaster Recovery:
With S3’s cross-region replication, businesses can automatically save their data in multiple Amazon
regions, ensuring that it is protected against regional disasters. In the event of a disaster,
organizations can quickly restore their data from the replicated copies stored in S3, minimizing
downtime and data loss.

S3’s durability and availability also make it an excellent choice for storing backups of critical
systems and databases. By regularly backing up data to S3, organizations can quickly recover their
systems in the event of a failure, reducing the impact on business operations.

4. Big Data and Analytics:


S3’s low-cost object storage makes it suitable for storing large volumes of raw data.
Organizations can ingest data from various sources into S3, including log files, sensor data, and
social media feeds. S3’s integration with big data processing frameworks like Apache Hadoop and
Apache Spark enables businesses to process and analyze this data at scale.

Additionally, S3 supports data lake architectures, allowing organizations to store structured and
unstructured data in its native format. This reduces the need for data transformation, reducing
complexity and enabling faster data processing. S3 tightly integrates with Amazon’s big data
analytics services like Amazon Athena and Amazon Redshift.

5. Software and Object Distribution:


S3 is commonly used by organizations to distribute software packages, firmware updates, and other
digital assets to users, customers, or employees. S3’s global network of edge locations ensures fast
and efficient delivery of these files, regardless of the users’ location.

With S3’s support for access control policies and signed URLs, businesses can ensure that only
authorized users can access their distributed files. This provides an additional layer of security and
prevents unauthorized distribution or tampering of software packages.
Amazon S3 Storage Classes:

S3 provides storage tiers, also called storage classes, which can be applied at the bucket or object
level. S3 also provides lifecycle policies you can use to automatically move objects between tiers,
based on rules or thresholds you define.

The main storage classes are:

• Standard—for frequently accessed data

• Standard-IA—standard infrequent access

• One Zone-IA—one-zone infrequent access

• Intelligent-Tiering—automatically moves data to the most appropriate tier.
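As a sketch of how a lifecycle policy can move objects between these tiers, the following hypothetical Terraform fragment (the resource names and day thresholds are illustrative assumptions, not taken from the project) transitions objects to Standard-IA after 30 days and deletes them after a year:

```hcl
# Hypothetical lifecycle rule: resource names and thresholds are illustrative.
resource "aws_s3_bucket_lifecycle_configuration" "example" {
  bucket = aws_s3_bucket.example.id

  rule {
    id     = "tier-and-expire"
    status = "Enabled"

    # Move objects to the infrequent-access tier after 30 days.
    transition {
      days          = 30
      storage_class = "STANDARD_IA"
    }

    # Delete objects after one year.
    expiration {
      days = 365
    }
  }
}
```

Because the rule is declared in code, the same tiering behavior is applied consistently to every environment the configuration is deployed to.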

2.2 Static Web Hosting
Static web hosting involves serving web pages and content directly from a web server without the
need for server-side processing or dynamic content generation. It is ideal for websites with fixed
content that rarely changes, such as informational sites, portfolios, blogs, and documentation pages.
Static web hosting simplifies the infrastructure requirements as it only requires a web server to
serve the static files, typically HTML, CSS, JavaScript, images, and other media files. This
approach offers several advantages, including fast performance, low cost, high scalability, and
simplicity in deployment and maintenance. Static web hosting is often used in combination with
content delivery networks (CDNs) to further enhance performance and global availability by
caching content at edge locations closer to end-users.

Businesses and individuals commonly use static web hosting for a variety of purposes, including
hosting personal websites, company landing pages, product documentation, and marketing
websites. It is particularly well-suited for projects where simplicity, speed, and cost-effectiveness
are priorities. Static web hosting platforms and services, such as Amazon S3, GitHub Pages, Netlify,
and Firebase Hosting, offer easy-to-use tools and features for deploying, managing, and scaling
static websites, making them accessible to users of all skill levels without the need for extensive
server administration or development expertise.

2.3 Access Control List


An ACL (Access Control List) is a mechanism used in computer systems to define permissions
and access rights for resources such as files, directories, and network resources. ACLs specify
which users or groups are granted access to specific resources and the type of access they are
allowed, such as read, write, or execute permissions. In cloud computing environments like
Amazon Web Services (AWS), ACLs are commonly used to control access to objects stored in
services like Amazon S3 buckets or to restrict network traffic in virtual private clouds (VPCs).
ACLs can be managed at both the resource level (e.g., setting permissions on individual files) and
the network level (e.g., defining rules for inbound and outbound traffic). Properly configuring
ACLs is essential for ensuring security and compliance in cloud environments while also enabling
efficient and controlled access to resources for authorized users and applications.


2.4 Terraform
Terraform is an open-source infrastructure as code (IaC) tool that allows users to define and
provision infrastructure resources using a declarative configuration language. With Terraform,
infrastructure is described in configuration files, known as Terraform scripts, which specify the
desired state of the infrastructure. Terraform then automatically manages and orchestrates the
provisioning of resources across various cloud providers, data centers, and service providers,
ensuring consistency and repeatability in infrastructure deployment. Terraform supports a wide
range of cloud services and providers, including Amazon Web Services (AWS), Microsoft Azure,
Google Cloud Platform (GCP), and more, making it a versatile tool for managing infrastructure
across hybrid and multi-cloud environments.

Terraform follows a "write, plan, and apply" workflow, where users first define their infrastructure
configuration in Terraform scripts, then use Terraform commands to generate an execution plan and
apply the changes to the infrastructure. This workflow enables users to manage infrastructure
changes in a controlled and predictable manner, with Terraform providing features for dependency
management, resource tracking, and state management to ensure that changes are applied safely
and efficiently. Terraform's declarative approach to infrastructure management allows users to
focus on describing the desired end state of their infrastructure rather than worrying about the
underlying implementation details, resulting in simpler, more maintainable, and more predictable
infrastructure configurations.

Overall, Terraform simplifies and streamlines infrastructure management by enabling users to


define, provision, and manage infrastructure resources using code. It facilitates infrastructure as
code practices, enabling teams to automate infrastructure deployment, enforce configuration
consistency, and improve collaboration and efficiency across development, operations, and other
stakeholders. With Terraform, organizations can achieve faster time-to-market, improved reliability,
and greater agility in deploying and managing their IT resources, while also reducing manual effort
and minimizing the risk of errors in infrastructure operations.

3. Architecture

4. Implementation

Step 1: Setting Up Providers (providers.tf)

In the providers.tf file, we specify the required providers for our Terraform configuration. In this
case, we require the AWS provider from HashiCorp with version 5.47.0.

The provider block configures the AWS provider with specific settings. Here, we specify the AWS
region as "ap-south-1" to deploy our resources in the Asia Pacific (Mumbai) region.
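A minimal providers.tf matching this description might look like the following sketch (the provider version and region come from the text above; the exact file contents are an assumption, since the report shows them only as a screenshot):

```hcl
# providers.tf — sketch based on the description above.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "5.47.0"
    }
  }
}

# Deploy all resources in the Asia Pacific (Mumbai) region.
provider "aws" {
  region = "ap-south-1"
}
```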

Step 2: Defining Variables (variables.tf)

In the variables.tf file, we define a variable named "bucketname" with a default value of
"mybucketstaticweb-hosting-2024".

Variables in Terraform allow us to parameterize our configuration, making it more flexible and
reusable across different environments.
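The variable described above can be sketched as follows (the description string is an illustrative addition):

```hcl
# variables.tf — sketch; the default value is taken from the description above.
variable "bucketname" {
  description = "Name of the S3 bucket used for static website hosting"
  type        = string
  default     = "mybucketstaticweb-hosting-2024"
}
```

Any environment can override the default at plan time, e.g. `terraform plan -var="bucketname=my-other-bucket"`.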

Step 3: Creating AWS S3 Bucket and Objects (main.tf):

• In the main.tf file, we define resources to create an AWS S3 bucket (aws_s3_bucket) named
"mybucket" using the variable "bucketname".

Before running this command:

In AWS S3:

After running the command:

• We also configure ownership controls (aws_s3_bucket_ownership_controls) and public
access block settings (aws_s3_bucket_public_access_block) for the bucket to ensure
security and compliance.

• Additionally, we define an ACL (aws_s3_bucket_acl) to grant public read access to the


bucket contents.

• Finally, we upload static files (index.html, error.html, pro1.jpeg, proj2.jpg) to the S3 bucket
using aws_s3_object resources, setting appropriate ACLs and content types.
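The resources described in the bullets above can be sketched as follows (resource names and local file paths are illustrative assumptions; only one object upload is shown, since the others follow the same pattern). Note that the public-read ACL can only be applied once the ownership controls and public access block settings are in place:

```hcl
# main.tf — sketch of the resources described above.
resource "aws_s3_bucket" "mybucket" {
  bucket = var.bucketname
}

# Allow ACLs on the bucket (required before a public-read ACL can be set).
resource "aws_s3_bucket_ownership_controls" "example" {
  bucket = aws_s3_bucket.mybucket.id
  rule {
    object_ownership = "BucketOwnerPreferred"
  }
}

# Relax the account-level public access blocks for this bucket.
resource "aws_s3_bucket_public_access_block" "example" {
  bucket                  = aws_s3_bucket.mybucket.id
  block_public_acls       = false
  block_public_policy     = false
  ignore_public_acls      = false
  restrict_public_buckets = false
}

# Grant public read access to the bucket contents.
resource "aws_s3_bucket_acl" "example" {
  depends_on = [
    aws_s3_bucket_ownership_controls.example,
    aws_s3_bucket_public_access_block.example,
  ]
  bucket = aws_s3_bucket.mybucket.id
  acl    = "public-read"
}

# Upload one static file; error.html, pro1.jpeg, and proj2.jpg follow the same pattern.
resource "aws_s3_object" "index" {
  bucket       = aws_s3_bucket.mybucket.id
  key          = "index.html"
  source       = "index.html"
  acl          = "public-read"
  content_type = "text/html"
}
```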

After running the command “terraform apply -auto-approve”, the objects (index.html, error.html,
pro1.jpeg, proj2.jpg) get uploaded into the bucket.

In AWS S3:

Their ACLs also get changed like this:

Index.html:

Error.html:

Pro1.jpeg:

Proj2.jpg:

Step 4: Configuring S3 Bucket for Website Hosting:

We configure the S3 bucket for static website hosting using the
aws_s3_bucket_website_configuration resource. This includes specifying the index and error
documents for the website.
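A sketch of that resource, assuming the bucket and object names used in the earlier steps:

```hcl
# Website configuration — sketch; the bucket reference is an assumption.
resource "aws_s3_bucket_website_configuration" "example" {
  bucket = aws_s3_bucket.mybucket.id

  index_document {
    suffix = "index.html"
  }

  error_document {
    key = "error.html"
  }
}
```

Once applied, the site is served from the bucket's website endpoint, with error.html returned for missing pages.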

After running “terraform apply -auto-approve”, our static website will be created. The output in
the terminal after this step will be:

In AWS S3:

5. OUTPUTS

Our static website:

