Terraform Full Course
Cost reduction
By removing the manual component, people are able to refocus their efforts towards other tasks.
Speed
IaC allows faster execution when configuring infrastructure and aims at providing visibility to help other teams across the enterprise work quickly and more efficiently.
Reduced risk
Automation removes the risk associated with human error, like manual misconfiguration; removing this can decrease downtime and increase reliability.
Test
Infrastructure as Code enables DevOps teams to test applications in production-like environments early in the development cycle.
Accountability
Since you can version IaC configuration files like any source code file, you have full traceability of the changes each configuration has undergone.
Documentation
Not only does IaC automate the process, but it also serves as documentation of the proper way to instantiate infrastructure, and as insurance against the loss of
institutional knowledge when employees leave your company. Because code can be version-controlled, IaC allows every change to your server configuration to be documented,
logged, and tracked. And these configurations can be tested, just like code.
Enhanced security
If all compute, storage, and networking services are provisioned with code, then they are deployed the same way every time. This means that security standards can be easily
and consistently deployed across the company without having to have a security gatekeeper review and approve every change.
What is Terraform?
Terraform is an infrastructure as code (IaC) tool that allows you to
build, change, and version infrastructure safely and efficiently. This
includes low-level components such as compute instances, storage,
and networking, as well as high-level components such as DNS
entries, SaaS features, etc. Terraform can manage both existing
service providers and custom in-house solutions.
Infrastructure as code
With Terraform, you can use code to manage and maintain resources. It allows you to store the infrastructure status, so that you can track the changes in
different components of the system (infrastructure as code) and share these configurations with others.
Terraform Get
The terraform get command is used to download and update modules mentioned in the root module.
Important points:
1) The modules are downloaded into a .terraform subdirectory of the current working directory. Don't commit this directory to your version
control repository.
2) The get command supports the following options:
● -update - If specified, modules that are already downloaded will be checked for updates and the updates will be downloaded if present.
● -no-color - Disable text coloring in the output.
terraform get downloads and updates modules mentioned in the root module
terraform get -update=true checks modules already downloaded for updates and updates them
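For illustration, a root module that calls a child module might look like this (the module name and path are assumptions); running terraform get from this directory downloads the module into .terraform:

```hcl
# Hypothetical root-module call to a local child module.
# `terraform get` fetches it into the .terraform subdirectory.
module "network" {
  source = "./modules/network" # assumed path to the child module
}
```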
Terraform Init
This command performs several different initialization steps in order to prepare the current working directory for use with Terraform.
Important points:
1) This command is always safe to run multiple times, to bring the working directory up to date with changes in the configuration.
2) Though subsequent runs may give errors, this command will never delete your existing configuration or state.
3) It initializes a working directory containing Terraform configuration files.
4) It performs backend initialization (storage for the Terraform state file) and modules installation.
5) It downloads provider plugins from the Terraform Registry to a local path; the plugins are downloaded into a subdirectory of the
present working directory at the path .terraform/plugins
6) It supports -upgrade to update all previously installed plugins to the newest version that complies with the configuration's version
constraints; it does not delete the existing configuration or state
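As a sketch, a terraform block like the following (the provider version constraint is illustrative) is what terraform init acts on, downloading the declared provider plugin into the working directory:

```hcl
terraform {
  required_version = ">= 0.12"

  # `terraform init` installs the providers declared here
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.0" # assumed version constraint
    }
  }
}
```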
Terraform Plan
This command checks whether the execution plan for a configuration matches your expectations before provisioning or changing infrastructure.
Terraform Validate
The terraform validate command validates the configuration files in a directory, referring only to the configuration and not accessing any remote
services such as remote state, provider APIs, etc.
Validate runs checks that verify whether a configuration is syntactically valid and internally consistent, regardless of any provided variables or
existing state. It is thus primarily useful for general verification of reusable modules, including correctness of attribute names and value types.
Important points:
1) It validates syntactically for format and correctness.
2) It is used to validate/check the syntax of the Terraform files.
3) It verifies whether a configuration is syntactically valid and internally consistent, regardless of any provided variables or existing state.
4) A syntax check is done on all the terraform files in the directory, and will display an error if any of the files doesn’t validate.
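To illustrate, terraform validate would flag a configuration like the following, where an argument name is misspelled (a hypothetical typo for cidr_block):

```hcl
# `terraform validate` reports an error here: "cidr_blok" is not
# a valid argument for the aws_vpc resource type
resource "aws_vpc" "broken" {
  cidr_blok = "10.0.0.0/16" # typo: should be cidr_block
}
```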
Terraform Apply
Another way to use terraform apply is to pass it the filename of a saved plan file you created earlier with terraform plan -out=..., in which case Terraform will apply the changes
in the plan without any confirmation prompt. This two-step workflow is primarily intended for when running Terraform in automation.
Important Points
● It applies changes to reach the desired state.
● It scans the current directory for the configuration and applies the changes appropriately.
● It can be provided with an explicit plan, saved as out from terraform plan.
● If no explicit plan file is given on the command line, terraform apply will create a new plan automatically and prompt for approval to apply it.
● It does not import any resource.
● It supports -auto-approve to apply the changes without asking for a confirmation.
● It will modify the infrastructure and the state.
● It supports -target to apply a specific module.
● If a resource successfully creates but fails during provisioning,
○ Terraform will error and mark the resource as “tainted”.
○ A resource that is tainted has been physically created, but can’t be considered safe to use since provisioning failed.
○ Terraform also does not automatically roll back and destroy the resource during the apply when the failure happens, because that would go against the execution plan: the execution plan would’ve said a resource will be created, but does not say it will ever be deleted.
Terraform State
Terraform updates state automatically during plans and applies. However, it's sometimes necessary to make deliberate adjustments to
Terraform's state data, usually to compensate for changes to the configuration or the real managed infrastructure.
Important Points:
● State helps keep track of the infrastructure Terraform manages
● It maps real-world resources to the Terraform configuration
● It is stored locally in terraform.tfstate or can be stored in a remote service
such as Terraform Cloud, HashiCorp Consul, Amazon S3, Azure Blob
Storage, Google Cloud Storage, Alibaba Cloud OSS
● It is recommended not to edit the state manually
Terraform Workspace
Important Points
● It helps manage multiple distinct sets of infrastructure resources or environments with the same code.
● You just need to create the needed workspaces and use them, instead of creating a directory for each environment to manage
● state files for each workspace are stored in the directory terraform.tfstate.d
● terraform workspace new dev creates a new workspace and switches to it as well
● terraform workspace select dev helps select workspace
● terraform workspace list lists the workspaces and shows the current active one with *
● does not provide strong separation as it uses the same backend
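The current workspace name is available in configurations as terraform.workspace, which can be used to vary resources per environment; the AMI and instance types below are placeholders:

```hcl
# Sketch: size and tag an instance based on the active workspace
resource "aws_instance" "server" {
  ami           = "ami-abc123" # placeholder AMI ID
  instance_type = terraform.workspace == "prod" ? "t3.large" : "t3.micro"

  tags = {
    Environment = terraform.workspace # e.g. "dev" or "prod"
  }
}
```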
Terraform Destroy
While you will typically not want to destroy long-lived objects in a production environment, Terraform is sometimes used to manage ephemeral
infrastructure for development purposes, in which case you can use terraform destroy to conveniently clean up all of those temporary objects
once you are finished with your work.
Important Points
terraform destroy destroys all remote objects managed by a particular Terraform configuration
terraform destroy -auto-approve destroys/cleans up without being prompted to enter "yes"
terraform destroy -target only destroys the targeted resource and its dependencies
Terraform Import
The terraform import command is used to import existing resources into Terraform.
ADDRESS must be a valid resource address. Because any resource address is valid, the import command can import resources into modules
as well as directly into the root of your state.
Important Points
● It helps import already-existing external resources, not managed by Terraform, into the Terraform state, allowing Terraform to manage those
resources
● Terraform is not able to auto-generate configurations for those imported resources, for now, and requires you to first write the resource
definition in Terraform and then import this resource
terraform import aws_instance.foo i-abcd1234 imports an AWS instance with ID i-abcd1234 into the aws_instance resource named “foo”
terraform import module.foo.aws_instance.bar i-abcd1234 imports an AWS instance into the aws_instance resource named bar inside a module named foo
Terraform Taint
The terraform taint command informs Terraform that a particular object has become degraded or damaged. Terraform represents this by
marking the object as "tainted" in the Terraform state, in which case Terraform will propose to replace it in the next plan you create.
Note: It’s deprecated; it is now recommended to use the -replace option with terraform apply, e.g.
terraform apply -replace="aws_instance.example[0]"
Important Points
● It marks a Terraform-managed resource as tainted, forcing it to be destroyed and recreated on the next apply.
● It will not modify infrastructure, but does modify the state file in order to mark a resource as tainted. Infrastructure and state are
changed in next apply.
● It can be used to taint a resource within a module
terraform taint aws_instance.my_ec2 marks a Terraform-managed resource as tainted, forcing it to be destroyed and recreated on the next apply
terraform untaint aws_instance.my_ec2 unmarks a Terraform-managed resource as tainted
Terraform Graph
The terraform graph command is used to generate a visual representation of either a configuration or execution plan. The output is in the DOT
format. The typical program that can read this format is GraphViz, which can use it to generate charts, but many web services are also available to
read this format.
terraform graph | dot -Tsvg > graph.svg terraform graph creates a resource graph listing all resources in your configuration and their
dependencies.
Terraform Coding
main.tf, variables.tf, outputs.tf. These are the recommended filenames to create infrastructure as code using Terraform; main.tf should be the primary
entrypoint.
Typically, resource creation may be split into multiple files, but any nested module calls should be in the main file. variables.tf and outputs.tf should
contain the declarations for variables and outputs, respectively.
Terraform version:
To check the Terraform version, we can run the command: terraform -version
The required version can be pinned in the main Terraform file, i.e. main.tf, e.g.
terraform {
required_version = ">= 0.12"
}
Resource Declaration:
resource "aws_vpc" "main" {
cidr_block = var.base_cidr_block
}
● Resources are the most important element in the Terraform language; each resource block describes one or more infrastructure objects, such as compute instances, etc.
● Resource type and local name together serve as an identifier for a given resource and must be unique within a module, e.g. aws_instance.local_name
The general syntax of a block is:
<BLOCK TYPE> "<BLOCK LABEL>" "<BLOCK LABEL>" {
# Block body
<IDENTIFIER> = <EXPRESSION> # Argument
}
In the example above:
“resource” = reserved keyword
“aws_vpc” = resource type provided by the Terraform provider
“main” = user-provided arbitrary resource name
Data Sources
Data sources allow Terraform to use information defined outside of Terraform, defined by another separate Terraform configuration, or modified by
functions.
In addition to resources, data sources are another important component of Terraform projects. Data sources can be used to read values from existing
services in the target platform.
E.g.
data "aws_ami" "example" {
most_recent = true
owners = ["self"]
tags = {
Name = "app-server"
Tested = "true"
}
}
And then we can use the data source by referencing it as data.aws_ami.example.id.
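A more complete sketch (the owner ID and filter values are assumptions for illustration) looks up the latest matching AMI and feeds it into a resource:

```hcl
# Look up the most recent AMI matching a name pattern
data "aws_ami" "ubuntu" {
  most_recent = true
  owners      = ["099720109466"] # assumed: Canonical's account ID

  filter {
    name   = "name"
    values = ["ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*"]
  }
}

# Use the data source's result in a resource
resource "aws_instance" "web" {
  ami           = data.aws_ami.ubuntu.id
  instance_type = "t2.micro"
}
```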
Variables
Terraform projects can be made parameterisable with variables. Variables can have a default value. They are referenced in the project via their name.
The Terraform language includes a few kinds of blocks for requesting or publishing named values.
● Input Variables serve as parameters for a Terraform module, so users can customize behavior without editing the source.
● Output Values are like return values for a Terraform module.
● Local Values are a convenience feature for assigning a short name to an expression.
Input Variable
Input variables serve as parameters for a Terraform module, allowing aspects of the module to be customized without altering the module's own source code, and allowing modules to be shared between different configurations.
Each input variable accepted by a module must be declared using a variable block:
variable "image_id" {
type = string
description = "The id of the machine image (AMI) to use for the server."
}
Within the module that declared a variable, its value can be accessed from within expressions as var.<NAME>, where <NAME> matches the label given in the declaration block:
resource "aws_instance" "example" {
instance_type = "t2.micro"
ami = var.image_id
}
Types
The type argument in a variable block allows you to restrict the type of value that will be accepted as the value for a variable. If no type constraint is set then a value of any type is accepted.
The supported type keywords are:
● string
● number
● bool
The type constructors allow you to specify complex types such as collections:
● list(<TYPE>): a sequence of values identified by consecutive whole numbers starting with zero
● set(<TYPE>): a collection of unique values that do not have any secondary identifiers or ordering
● map(<TYPE>): a collection of values where each is identified by a string label
● object({<ATTR NAME> = <TYPE>, ... }): a collection of named attributes that each have their own type
● tuple([<TYPE>, ...]): a sequence of elements identified by consecutive whole numbers starting with zero, where each element has its own type
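A sketch combining the type constructors above (the variable names and defaults are illustrative):

```hcl
variable "server_settings" {
  # object() constrains the variable to named, typed attributes
  type = object({
    instance_type = string
    disk_size_gb  = number
    monitoring    = bool
  })

  default = {
    instance_type = "t2.micro"
    disk_size_gb  = 20
    monitoring    = false
  }
}

variable "availability_zones" {
  type    = list(string) # ordered sequence of strings
  default = ["us-east-1a", "us-east-1b"]
}
```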
Ways to pass variables
Variable Definition Precedence
Terraform loads variables in the following order, with later sources taking precedence over earlier ones:
● Environment variables
● The terraform.tfvars file, if present.
● The terraform.tfvars.json file, if present.
● Any *.auto.tfvars or *.auto.tfvars.json files, processed in lexical order of their filenames.
● Any -var and -var-file options on the command line, in the order they are provided. (This includes variables set by a Terraform Cloud workspace.)
Via variable definitions files
A variable definitions file uses the same basic syntax as Terraform language files, but consists only of variable name assignments:
image_id = "ami-abc123"
availability_zone_names = [
"us-east-1a",
"us-west-1c",
]
Via environment variables
Values for variables can also be defined as environment variables. The name of the Terraform variable has to be provided with the prefix TF_VAR_.
E.g: export TF_VAR_image_id=ami-abc123
Variables input validation, custom validation rules
In addition to type constraints as described on the last slide, a module author can specify arbitrary custom validation rules for a particular variable using a
validation block nested within the corresponding variable block:
variable "image_id" {
type = string
description = "The id of the machine image (AMI) to use for the server."
validation {
condition = length(var.image_id) > 4 && substr(var.image_id, 0, 4) == "ami-"
error_message = "The image_id value must be a valid AMI id, starting with \"ami-\"."
}
}
String Interpolation
String interpolation is an integral part of HCL. Variables can be used via ${} in strings. E.g.
variable "resource_prefix" {
type = string
default = "example"
}
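Continuing the resource_prefix example, an interpolated string might be used like this (the bucket resource is an illustrative choice):

```hcl
# "${var.resource_prefix}-logs" evaluates to "example-logs"
# when the variable keeps its default value
resource "aws_s3_bucket" "logs" {
  bucket = "${var.resource_prefix}-logs"
}
```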
Output Values
● A child module can use outputs to expose a subset of its resource attributes to a parent module.
● A root module can use outputs to print certain values in the CLI output after running terraform apply.
● When using remote state, root module outputs can be accessed by other configurations via a terraform_remote_state data source
Important points:
● are like function return values.
● An output can be marked as containing sensitive material using the optional sensitive argument, which prevents Terraform from showing its
value in the list of outputs. However, it is still stored in the state as plain text.
● In a parent module, outputs of child modules are available in expressions as module.<MODULE NAME>.<OUTPUT NAME>.
E.g.
output "instance_ip_addr" {
value = aws_instance.server.private_ip
description = "The private IP address of the main server instance."
}
An output can be marked as containing sensitive material using the optional sensitive argument:
output "db_password" {
value = aws_db_instance.db.password
description = "The password for logging in to the database."
sensitive = true
}
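Root module outputs can also be read by other configurations through the terraform_remote_state data source; a sketch with assumed backend settings and output names:

```hcl
# Read another configuration's state from an S3 backend
data "terraform_remote_state" "network" {
  backend = "s3"
  config = {
    bucket = "my-terraform-state"        # assumed bucket name
    key    = "network/terraform.tfstate" # assumed state key
    region = "us-east-1"
  }
}

# Consume one of its root-module outputs (assumed output name)
resource "aws_instance" "app" {
  ami           = "ami-abc123" # placeholder
  instance_type = "t2.micro"
  subnet_id     = data.terraform_remote_state.network.outputs.subnet_id
}
```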
Local Values
A local value assigns a name to an expression, so you can use it multiple times within a module without repeating it.
In short: Local values are like a function's temporary local variables.
Important Points:
● locals assigns a name to an expression, allowing it to be used multiple times within a module without repeating it.
● are like a function’s temporary local variables.
● helps to avoid repeating the same values or expressions multiple times in a configuration.
Example:
locals {
service_name = "forum"
owner = "Community Team"
}
Or:
locals {
# Common tags to be assigned to all resources
common_tags = {
Service = local.service_name
Owner = local.owner
}
}
Use like below:
resource "aws_instance" "example" {
# ...
tags = local.common_tags
}
Providers
Terraform relies on plugins called "providers" to interact with cloud providers, SaaS providers, and other APIs.
Terraform configurations must declare which providers they require so that Terraform can install and use them. Additionally, some providers require
configuration (like endpoint URLs or cloud regions) before they can be used.
As per Terraform, a provider is responsible for understanding API interactions and exposing resources.
Of course, it supports Alibaba Cloud, AWS, GCP, and Microsoft Azure; each of these is treated as a provider.
Here is a detailed list of providers (providers get added/updated from time to time):
ACME, Akamai, Alibaba Cloud, Archive, Arukas, Avi Vantage, Aviatrix, AWS, Azure, Azure Active Directory, Azure Stack, A10 Networks, Bitbucket,
Brightbox, CenturyLinkCloud, Chef, CherryServers, Circonus, Cisco ASA, Cisco ACI, Cloudflare, CloudScale.ch, CloudStack, Cobbler, Consul, Datadog,
DigitalOcean, DNS, DNSimple, DNSMadeEasy, Docker, Dome9, Dyn, Exoscale, External, F5 BIG-IP, Fastly, FlexibleEngine, FortiOS, Genymotion,
GitHub, GitLab, Google Cloud Platform, Grafana, Gridscale, Hedvig, Helm, Heroku, Hetzner Cloud, HTTP, HuaweiCloud, HuaweiCloudStack, Icinga2,
Ignition, InfluxDB, JDCloud, Kubernetes, LaunchDarkly, Librato, Linode, Local, Logentries, LogicMonitor, Mailgun, MongoDB Atlas, MySQL, Naver
Cloud, Netlify, New Relic, Nomad, NS1, Null, Nutanix, 1&1, OpenNebula, OpenStack, OpenTelekomCloud, OpsGenie, Oracle Cloud Infrastructure,
Oracle Cloud Platform, Oracle Public Cloud, OVH, Packet, PagerDuty, Palo Alto Networks, PostgreSQL, PowerDNS, ProfitBricks, Pureport, RabbitMQ,
Rancher, Rancher2, Random, RightScale, Rundeck, RunScope, Scaleway, Selectel, SignalFx, Skytap, SoftLayer, Spotinst, StatusCake,
TelefonicaOpenCloud, Template, TencentCloud, Terraform, Terraform Cloud, TLS, Triton, UCloud, UltraDNS, Vault, Venafi, VMware NSX-T, VMware
vCloud Director, VMware vRA7, VMware vSphere, Vultr, Yandex
E.g.
provider "aws" {
region = var.aws_region
}
Then use like:
# Create a VPC to launch our instances into
resource "aws_vpc" "default" {
cidr_block = "10.0.0.0/16"
}
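When the same provider is needed with two configurations (for example two AWS regions), an alias can be used; this is a sketch with placeholder values:

```hcl
# Default AWS provider configuration
provider "aws" {
  region = "us-east-1"
}

# Additional configuration, selected via `provider = aws.west`
provider "aws" {
  alias  = "west"
  region = "us-west-2"
}

resource "aws_instance" "replica" {
  provider      = aws.west     # use the aliased provider
  ami           = "ami-abc123" # placeholder AMI ID
  instance_type = "t2.micro"
}
```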
Built-in Functions
The Terraform language includes a number of built-in functions that you can call from within expressions to transform and combine values. The general
syntax for function calls is a function name followed by comma-separated arguments in parentheses:
max(5, 12, 9)
Numeric functions: abs, ceil, floor, log, max, min, parseint, pow, signum
String functions: chomp, format, formatlist, join, lower, regex, regexall, replace, split, strrev, title, trim, trimprefix, trimsuffix, trimspace, upper
Collection functions: alltrue, anytrue, chunklist, coalesce, coalescelist, compact, concat, contains, distinct, element, flatten, index, keys, length, list,
lookup, map, matchkeys, merge, one, range, reverse, setintersection, setproduct, setsubtract, setunion, slice, sort, sum, transpose, values, zipmap
Encoding functions: base64encode, base64decode, base64gzip, csvdecode, jsonencode, jsondecode, urlencode, yamlencode, yamldecode
Filesystem functions: abspath, dirname, pathexpand, basename, file, fileexists, fileset, filebase64, templatefile
Hash and crypto functions: base64sha256, base64sha512, bcrypt, filebase64sha512, filemd5, filesha1, filesha256, filesha512, md5, rsadecrypt,
sha1, sha256, sha512, uuid, uuidv5
Type conversion functions: can, defaults, nonsensitive, sensitive, tobool, tolist, tomap, tonumber, toset, tostring, try
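A few of these functions in use inside locals; the values in the comments are what terraform console would report for these expressions:

```hcl
locals {
  upper_name = upper("forum")              # "FORUM"
  joined     = join("-", ["app", "prod"])  # "app-prod"
  biggest    = max(5, 12, 9)               # 12
  as_number  = tonumber("42")              # 42
  first_az   = element(["a", "b", "c"], 0) # "a"
}
```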
Modules
A Terraform module is a set of Terraform configuration files in a single directory. Even a simple configuration consisting of a single directory with one or more .tf files is a module. When you run Terraform commands directly from such a directory, it is considered the root module. So in this sense, every Terraform configuration is part of a module.
Modules are containers for multiple resources that are used together. A module consists of a collection of .tf and/or .tf.json files kept together in a directory.
Modules are the main way to package and reuse resource configurations with Terraform. With modules, reusable components can be created. Each Terraform project is a valid Terraform module. In addition, modules (child modules) can be created in sub-folders within a project. Variables serve as the configuration interface of modules. For each module to be used, a module block must be defined in the project.
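A sketch of a module block calling a child module and reading one of its outputs (the source path, variable names, and output name are assumptions):

```hcl
# Call a child module kept in a sub-folder of the project
module "web_server" {
  source        = "./modules/web-server"
  instance_type = "t2.micro" # passed to the module's input variables
  server_count  = 2
}

# Outputs of child modules are read as module.<MODULE NAME>.<OUTPUT NAME>
output "web_ips" {
  value = module.web_server.ip_addresses # assumed output name
}
```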
Lifecycle property / meta-argument:
The behaviour of Terraform in the context of a resource can be influenced with meta-arguments. The lifecycle property offers three important levers to
individualise the Terraform lifecycle of a resource.
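A sketch showing the three lifecycle levers on a single resource (the resource type and values are illustrative):

```hcl
resource "aws_instance" "example" {
  ami           = "ami-abc123" # placeholder AMI ID
  instance_type = "t2.micro"

  lifecycle {
    create_before_destroy = true   # build the replacement before destroying
    prevent_destroy       = false  # when true, plans that would destroy this resource error out
    ignore_changes        = [tags] # out-of-band tag changes won't trigger a diff
  }
}
```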
Remote State
If Terraform projects are supervised by several people or are used by systems such as CI/CD servers, the state (Terraform state) must be kept
centrally. This ensures that no conflicting write operations are in progress and that each execution can use the latest state. So-called remote
state backends allow the Terraform state to be consumed by remote systems.
Terraform uses persistent state data to keep track of the resources it manages. Since it needs the state in order to know which real-world
infrastructure objects correspond to the resources in a configuration, everyone working with a given collection of infrastructure resources must
be able to access the same state data.
The local backend stores state as a local file on disk, but every other backend stores state in a remote service of some kind, which allows
multiple people to access it. Accessing state in a remote service generally requires some kind of access credentials, since state data contains
extremely sensitive information.
Some backends act like plain "remote disks" for state files; others support locking the state while operations are being performed, which helps
prevent conflicts and inconsistencies.
Options are: artifactory, azurerm, consul, cos, etcd, etcdv3, gcs, http, kubernetes, manta, oss, pg, s3, swift
With AWS S3
Stores the state as a given key in a given bucket on Amazon S3. This backend also supports state locking and consistency checking via DynamoDB, which can be enabled by setting the dynamodb_table field to an existing DynamoDB table name.
With Azure
Stores the state as a Blob with the given Key within the Blob Container within the Blob Storage Account. This backend also supports state locking and consistency checking via native capabilities of Azure Blob Storage.
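A sketch of an S3 backend block with DynamoDB locking (the bucket, key, and table names are placeholders):

```hcl
terraform {
  backend "s3" {
    bucket         = "my-terraform-state"     # assumed bucket name
    key            = "prod/terraform.tfstate" # state object key
    region         = "us-east-1"
    dynamodb_table = "terraform-locks"        # enables state locking
  }
}
```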
Terraform Enterprise
Important points:
● It includes all the Terraform Cloud features as described in previous slide
● It supports detailed audit logging and tracks the identity of the user requesting state and maintains a history of state changes.
● SAML for SSO provides the ability to govern user access to your applications.
A Terraform Enterprise install that is provisioned on a network that does not have Internet access is generally known as an air-gapped install. These
types of installs require you to pull updates, providers, etc. from external sources vs. being able to download them directly.
Terraform Enterprise currently supports running under the following operating systems for a Clustered deployment:
○ Ubuntu 16.04.3 – 16.04.5 / 18.04
○ Red Hat Enterprise Linux 7.4 through 7.7
○ CentOS 7.4 – 7.7
○ Amazon Linux
○ Oracle Linux
○ Clusters currently don’t support other Linux variants
Good luck!
I hope you’ll use this knowledge and build
awesome solutions.