cdk workshop
AWS Cloud Development Kit (CDK) is a framework for writing your infrastructure as
code. By the end of this workshop, you will have learned about using CDK internally
at Amazon, including how to use CDK pipelines, various constructs using TypeScript,
basic CDK commands, how to secure CDK constructs, and how to test CDK
infrastructure. You will also create a state machine that calls a Lambda, does
simple arithmetic on a list of numbers, saves the result in a DynamoDB table, and
emails it to you.
We will be using TypeScript as the CDK language of choice to create and manage our
infrastructure. Builder Tools’ Golden path architecture recommendation is to use
the TypeScript version of the CDK, and some constructs, such as those provided by
MoteCDK, are only vended in TypeScript. You will spend more time learning
CDK/CloudFormation rather than a new language. In addition, it will be easier to
find examples in TypeScript since the majority of CDK users choose TypeScript to
build their applications.
Who is this workshop for? Amazon SDEs who want to learn about using CDK internally
at Amazon, including how to use CDK pipelines, various constructs using TypeScript,
basic CDK commands, how to secure CDK constructs, and how to test CDK
infrastructure.
Note
The section about Mote CDK is currently only approved for CDO usage and not for AWS
teams.
What use-cases is this workshop good for? (1) Orchestration: when you need to
coordinate multiple AWS services into serverless workflows using Step Functions.
(2) Event-driven architectures: when you need to process AWS triggers (an S3 event,
SQS message, DynamoDB stream, etc.) that make calls to other AWS services or
internal Amazon services.
What should you know or do before you start? Create or use an existing Conduit,
Isengard, or Burner account (with a Burner account, plan to complete the entire
workshop while the account is still active).
What will you learn? You will understand the basics of CDK with TypeScript: what
CDK stacks are, and how to add AWS resources to your stacks. You will also learn
and use a number of CDK-related brazil-build commands.
In this step we use BuilderHub Create to create the pipeline, Java, and CDK
packages. Then we pull the source code to our dev machine and build the workspace.
We will predominantly use the CDK package in all the steps, but it's still helpful
for you to pull all the packages locally.
Provide an account name and check the first checkbox to acknowledge that usage is
not free.
The account may take 5-15 minutes to create. Note the account ID.
Important
Throughout this lab, you will be asked to use your Amazon username as part of a
command or resource name. Replace all instances of username with your own username,
following the capitalization of the text (e.g. username or Username).
Details Section
Clone Name: This can be whatever you want, but to make following along with this
workshop easier, we recommend calling it CdkWorkshopUsername
Owning Team: If you don’t already have one, we recommend creating a Team and POSIX
group called username-only just for yourself to use for personal projects.
To do this:
Go to Create a team.
For Secondary Owner, use the username of someone on your team, for instance your
onboarding buddy or your manager.
For Additional Members, enter your username and add yourself, and add any reason of
your choice in Add Override Reason.
After your team has been created, follow the instructions in the step-by-step guide
to attach a POSIX group with the same name (username-only) to the team. Waiting
time for the new team and POSIX group to show up in the CreateHub dropdown can be
minutes or even hours!
Posix group owner: The username-only POSIX group you created above. This may take a
little bit of time to appear after creation. If you’re still having issues finding
the group after waiting a bit and reloading the page, go back to your team and look
under Members to make sure you are included in the team membership. If not, add
yourself at the Additional Membership tab under Membership Policy and then retry.
If you have an existing AWS Account (for example a Burner account) that you want to
use, disable the Create an AWS account toggle and enter the AWS Account ID. If you
want to create a brand new AWS Account just for this lab, you can leave the toggle
enabled and select Individual for the account type.
Bindles Section
Disable Create a Bindle with the required permissions and under Select a Bindle
search for PersonalSoftwareBindle. There should be one with your username which you
can use to manage permissions for all your personal projects.
We recommend leaving all these settings alone, unless you have a good reason to
change them.
Once all the sections are filled out, click Next. If this is your first time using
BuilderHub Create with a personal bindle, you may get a warning like Bindle does
not grant Can Manage permissions to the service AWS Account. Just click the Grant
permissions link under the warning to add these permissions to your Bindle and then
come back and click Next again.
Confirm the settings and click Create!
After around 50 minutes, you will receive an email that your application was
created. Get a hot drink and meditate instead of staring at the screen. Or, you
can check out one of the modules in our SDE Foundations course if you're new to
Amazon, or one of the other learning resources in the Engineering Excellence
catalog.
Verify
Your new Lambda is a simple calculator which adds two numbers. Via your pipeline,
it has been deployed to the us-west-2 Oregon AWS Region. Let’s test it out!
First, get programmatic access to the AWS account you used by using the ADA CLI,
then check your AWS CLI version:
aws --version
If the returned version for aws-cli is 1.x.xx, see AWS CLI version 2 migration
instructions in the AWS Command Line Interface User Guide for Version 2 to upgrade
to version 2.
You can use the same command and remove the --cli-binary-format raw-in-base64-out
flag to unblock yourself for now.
Now, while you may not be blown away by this new capability, we'll next look at how
this Lambda can be plugged in to a nearly infinite number of possibilities using
AWS services and CDK.
The CDK package that BuilderHub Create provisioned for you has created a full CI/CD
setup (we call it a Pipeline). Open the following URL to see your pipeline:
https://pipelines.amazon.com/pipelines/CdkWorkshopUsername
This CDK pipeline has deployed your initial calculator Lambda code using
CloudFormation Stacks. You can view your code using the following link:
https://code.amazon.com/packages/CdkWorkshopUsername/trees/mainline
In the CloudFormation world, a stack is a collection of AWS resources managed
together. You can create CloudFormation stacks to define AWS resources using YAML
or JSON, or you can use CDK to define the stacks programmatically, which is what
we'll be exploring in this workshop.
https://code.amazon.com/packages/CdkWorkshopUsernameCDK/trees/mainline
If you expand the lib folder, you’ll see three files: app.ts, serviceStack.ts, and
monitoringStack.ts which were created for you by BuilderHub Create. The app.ts file
is the main entry point for CDK, which defines all the infrastructure the package
will create, while the other two files define CloudFormation stacks to be deployed
by the CDK. Feel free to explore this boilerplate code to familiarize yourself with
it.
Here’s a diagram of all the Stacks you have so far and the resources which exist in
each one. In the next part of this workshop, you will learn what goes into writing
a stack as you will create another stack from scratch yourself, stay tuned!
The command from BuilderHub is similar to this. Replace Username with your Amazon
username.
cd ~/workplace/CdkWorkshopUsername/
brazil ws use \
  -vs CdkWorkshopUsername/development \
  -p CdkWorkshopUsernameCDK \
  -p CdkWorkshopUsername \
  -p CdkWorkshopUsernameTests \
  --platform AL2_x86_64
cd src/CdkWorkshopUsernameCDK
Build
From within the CdkWorkshopUsernameCDK directory, run the following command to
build all of the packages:
Note
If you run into ERROR: You have no preference setting for {Node {VERSION}.x. | Java
{VERSION}}, try brazil setup --{node | java} then cycle through the versions by
repeatedly pressing Enter and typing in the home directories under the correct
node/java versions (usually /opt/homebrew/opt/node@VERSION/bin/node and
/Library/Java/JavaVirtualMachines/amazon-corretto-VERSION.jdk/Contents/Home). If
you made a mistake, use brazil prefs --delete --force --global --key
{cli.bin.node18x} or {cli.bin.java14_home} to delete the pref settings. If you are
using nvm (node version manager) then you can locate node versions (after they’re
installed, e.g., nvm install 14) with the command nvm which XX replacing XX with
the version (ex. 14, 16, 18) and use the outputted file path for brazil setup --
node inputs. Here is a helpful answer in Sage.
Note
If you run into Task: coverageReport FAILED, try installing Java 11 from Self
Service and then run brazil ws clean before attempting to install the packages
again.
Note
If you run into The security token included in the request is expired Error: Failed
to run CDK CLI despite being authenticated, try running brazil ws clean && rm
~/.aws/credentials && touch ~/.aws/credentials. For more information, see this
answer in Sage.
Note
If you run into Unsupported class file major version 62, your build is likely
running under a newer JDK than the tooling supports; re-run brazil setup --java and
select a supported JDK version.
After a successful build, listing the CloudFormation stacks from your CDK code
shows:
BONESBootstrap-111222333-111222333444555-us-west-2
Pipeline/PDGScaffoldingStack
Pipeline/PipelineDeploymentStack
CdkWorkshopUsername-Service-alpha
CdkWorkshopUsername-Monitoring-alpha
You'll notice the Service and Monitoring stacks we saw earlier are there, as well
as your pipeline and some bootstrap resources. Nice!
In this step, we will be transforming our simple calculator Lambda into a system
which spans multiple AWS services for even greater functionality. To deploy this
cleanly without disrupting anything we already have working, we’ll create it in a
new stack that we’ll call infraStack.
Tip
If you use vim, :set paste can be helpful to maintain formatting when copying
this code into your terminal. You can also use gg=G to format your document.
    // DynamoDB Setup
    this.table = new Table(this, 'Table', {
      partitionKey: { name: 'id', type: AttributeType.STRING }
    });

    // SNS Topic
    this.topic = new Topic(this, 'Topic', {
      displayName: 'VerificationTopic'
    });

    // Subscribe to Email
    const subscription = new Subscription(this, 'EmailSubscription', {
      topic: this.topic,
      protocol: SubscriptionProtocol.EMAIL,
      endpoint: '[email protected]' // replace with your email address
    });
  }
}
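For orientation, the snippet above belongs inside the constructor of a stack class in lib/infraStack.ts. Here is a minimal sketch of what the surrounding file could look like; the aws-cdk-lib v2 import paths are standard, but the exact props type and class shape in your BuilderHub-generated package may differ:

```typescript
// Hedged sketch of lib/infraStack.ts -- the class name matches the workshop's
// InfraStack, but the rest of this skeleton is an assumption, not generated code.
import { Stack, StackProps } from 'aws-cdk-lib';
import { AttributeType, Table } from 'aws-cdk-lib/aws-dynamodb';
import { Subscription, SubscriptionProtocol, Topic } from 'aws-cdk-lib/aws-sns';
import { Construct } from 'constructs';

export class InfraStack extends Stack {
  // Exposed publicly so other stacks (e.g. the step function stack) can
  // reference the table and topic as inputs.
  public readonly table: Table;
  public readonly topic: Topic;

  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);
    // ... the Table, Topic, and Subscription definitions shown above go here ...
  }
}
```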
Initialize the infra stack
Now, just like any programming project, creating a new file and writing some code
in it doesn’t actually do anything on its own. As we mentioned before, the app.ts
file is the entry point for CDK to build all the stacks you’ve defined, so we need
to add a bit of code to that file to wire up our new infra stack. Let’s make the
following changes to lib/app.ts:
Add SoftwareType to the list of @amzn/pipelines imports at the top of the file:
import {
  DeploymentPipeline,
  GordianKnotScannerApprovalWorkflowStep,
  Platform,
  ScanProfile,
  SoftwareType,
} from '@amzn/pipelines';
Import the new stack at the top of the file, just below where MonitoringStack was
imported:
// import the new stack at the top of the file, just below where MonitoringStack
was imported
import { InfraStack } from './infraStack';
This pulls in the new code we just wrote for us to use it in our application.
Put this code block near the middle of the file (be sure to replace Username with
your own user name), before the line which initializes serviceStack (const
serviceStack = new ServiceStack...):
const deploymentProps = {
  env: env,
  softwareType: SoftwareType.INFRASTRUCTURE,
};

const infraStack = new InfraStack(app, `CdkWorkshopUsername-Infra-${stageName}`,
  deploymentProps);
By initializing the InfraStack as a TypeScript object, compiling the code will now
result in the InfraStack getting generated into your CloudFormation templates.
Once done, we can re-run the command from before to show all the CloudFormation
Stacks available from our CDK code:
BONESBootstrap-111222333-111222333444555-us-west-2
Pipeline/PDGScaffoldingStack
Pipeline/PipelineDeploymentStack
CdkWorkshopUsername-Infra-alpha
CdkWorkshopUsername-Service-alpha
CdkWorkshopUsername-Monitoring-alpha
If you want to take it one step further, you can use CDK diffing functionality to
do a comparison of your changes against what’s currently deployed:
But, pushing to mainline without some prior validation that your changes work is
usually a recipe for failures, rollbacks, and reverts. Instead, we’ll deploy this
new stack first manually to verify it does what we want. To do this outside of the
pipeline, we’ll need CDK to set up a few extra resources for us by running the
following command to bootstrap our AWS account:
Open the AWS console, search for CloudFormation, and change the console region to
US West (Oregon).
Double-check your infraStack.ts file to ensure that you have replaced all Username
strings with your actual username. If you modify the file, re-run brazil-build to
rebuild it.
Now we know it’s safe to commit and push your infrastructure changes remotely. You
can also create a CR to have the changes reviewed before pushing the changes:
Our step function will be making decisions based on the user’s input and the data
stored in DynamoDB to implement a caching mechanism for our Lambda function. After
all, math doesn’t change all that often, so we don’t need to make our Lambda do all
the incredibly hard work to add the same two numbers together over and over again.
The image above shows the State Machine Graph inspector which is part of the AWS
Console. We’ll use it to test and troubleshoot the serverless state machine.
Add state machine
Just as before, we’ll add this new step function as a new CloudFormation stack.
Create a new file stepFunctionStack.ts alongside your other stacks in the lib/
directory, and populate it with the following code:
map.iterator(checkAndCalculateWorkflow);
Just as before, we’ll need to wire up this stack to our main application to
register it for creation and deployment in our pipeline. This time, though, we’ll
be using some outputs of our previously-created infra and service stacks as
properties (i.e., inputs) for our step function stack, so that we can reference those
resources in our workflow. In app.ts, import your new stack at the top:
import { StepFunctionStack } from "./stepFunctionStack";
And then initialize your new stack like so (be sure to replace Username with your
own user name):
By the way, those three dots in ...deploymentProps aren't a typo; you should
really include them in your code. That's the JavaScript spread syntax.
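Here is a quick, CDK-independent illustration of what spread syntax does (plain TypeScript; the property names are made up for the example):

```typescript
// Spread syntax (...) copies every property of an existing object into a new
// object literal, where you can then add or override fields.
const baseProps = { env: 'alpha', softwareType: 'INFRASTRUCTURE' };

// Equivalent to listing env and softwareType by hand, plus the new field:
const stackProps = { ...baseProps, stage: 'dev' };

console.log(stackProps.env);   // alpha -- copied from baseProps
console.log(stackProps.stage); // dev   -- added on top
```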
And, finally, just like before you’ll want to add stepFunctionStack to the
deploymentGroup stack list so that it gets updated by your pipeline:
brazil-build
brazil-build cdk deploy CdkWorkshopUsername-StepFunction-alpha --require-approval
never
We’re using --require-approval never in this command to avoid CDK prompting us to
confirm changes to the IAM Role’s policy statements, which it does for us
automatically with this change to allow AWS Step Functions to invoke your Lambda,
send SNS messages, and access DynamoDB. If you’re curious what this looks like,
just omit the flag and see what CDK does when you deploy.
Note
If you get the error The security token included in the request is expired, try to
rerun:
If you get the error [100%] fail: spawn bats ENOENT while deploying, be sure that
the BATS CLI is installed on your local system with the following command: toolbox
install batscli
If you get an error during cdk deploy about Zipping this asset produced an empty
zip file..., you can go to the CdkWorkshopUsername package and run brazil-build
release to generate a zip file. Or, refer to this Sage post to use BATS to
manually transform your Lambda. Afterwards, rerun the cdk deploy command.
After your stack successfully deploys, you should get an AWS Notification -
Subscription Confirmation email asking you to confirm that you want to subscribe to
the SNS notification topic that you just created. Click Confirm subscription to
finish subscribing to the topic so that you receive notifications in the next
section! If you didn’t receive any emails, double check that your email address is
correct in infraStack.ts; if not, fix that and redeploy your stacks.
Verify
It’s time to test our Step Function! Just like our Lambda, we can invoke this
programmatically using the AWS CLI. However, a much richer experience for
visualizing, debugging, and understanding Step Functions is to invoke them using
the browser-based AWS Console.
If you’re using an Isengard account, go to the Console Access page and click on
the Admin role link.
From the AWS Console, use the region dropdown in the upper right-hand corner to
select US West (Oregon) us-west-2.
Use the Search field to find and select the Step Functions service.
{
  "request": [
    {
      "number1": "5",
      "number2": "6"
    },
    {
      "number1": "5",
      "number2": "3"
    }
  ]
}
Click Start execution again.
After a few seconds the State Machine should turn green. You should also see 2
emails (one for each invocation) with the output of your computations!
Note
If you didn’t receive any emails, double check that your email address is correct
in infraStack.ts. Also verify that you have confirmed your subscription to the SNS
topic. Go to the first email you received from SNS (it should be titled AWS
Notification - Subscription Confirmation) and click the confirmation link within
it. Then re-run your step function.
This is because for this invocation of the Step Function, we had no stored value
for this computation in DynamoDB. As a result, the checkState step at the top chose
the left-most path through the state machine, which invokes the Lambda function to
compute the value, and then stores that value in DynamoDB before sending it to you
via SNS.
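The decision the step function makes can be modeled in a few lines of plain TypeScript, using a Map as a stand-in for the DynamoDB table (the function and key names here are illustrative, not taken from the workshop code):

```typescript
// A toy model of the step function's caching flow: check the "table" first,
// invoke the "Lambda" and store the result only on a cache miss, then notify.
const cache = new Map<string, number>();

function addWithCache(number1: number, number2: number): { result: number; cached: boolean } {
  const key = `${number1}+${number2}`;   // plays the role of the DynamoDB partition key
  const hit = cache.get(key);
  if (hit !== undefined) {
    // Right-hand path: value already stored, skip the Lambda entirely.
    return { result: hit, cached: true };
  }
  // Left-hand path: "invoke the Lambda", then persist the result.
  const result = number1 + number2;
  cache.set(key, result);
  return { result, cached: false };
}

console.log(addWithCache(5, 6)); // { result: 11, cached: false } -- computed
console.log(addWithCache(5, 6)); // { result: 11, cached: true }  -- served from cache
```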
Hint
Try checking out the DynamoDB console to see what data got added to your Table as a
result of this execution!
If you’re seeing where this is going, this means that the state of the step
function should be different the next time we invoke it with the same value. Go
ahead, give it a try! If you start another execution and paste the same value as
before, you’ll see a different path taken through the step function:
You should also notice that this execution goes much faster than the one before. As
it turns out, retrieving an item from DynamoDB is orders of magnitude quicker (and
cheaper) than invoking a Lambda function and waiting for its completion. This
allows us to effectively use DynamoDB as a cache to avoid expensive (both in time
and in dollars) Lambda invocations.
Try out some more inputs to see what happens! You can also click on each state in
the step function diagram to see the input and the output of each state, and follow
the execution along from the beginning to end. If you play around with the inputs,
you may discover a couple shortcomings of this state machine. If you’re up for a
challenge, see if you can modify the step function code to deal with one or more of
the following issues:
Try passing in input which does not have the number1 or number2 fields at all. What
happens? Could we improve this in a way that lets the customer know they’ve done
something wrong and how to correct it?
Try passing in the reverse of your input. For example if you cached the result of 5
+ 6 before, try 6 + 5. Is the result still cached? How could we modify the DynamoDB
storage and lookup to fix this?
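If you want a hint, here is one possible approach to both challenges, sketched in plain TypeScript (names are illustrative; in the real state machine you would express this with a validation Choice state and a normalized DynamoDB key):

```typescript
// Validate the input up front so the customer gets an actionable error, and
// normalize the cache key so that 5+6 and 6+5 share one cache entry.
function cacheKey(input: { number1?: string; number2?: string }): string {
  if (input.number1 === undefined || input.number2 === undefined) {
    throw new Error('Input must include both number1 and number2, e.g. {"number1":"5","number2":"6"}');
  }
  // Sort the operands: addition is commutative, so "5+6" and "6+5" map to the same key.
  const [a, b] = [Number(input.number1), Number(input.number2)].sort((x, y) => x - y);
  return `${a}+${b}`;
}

console.log(cacheKey({ number1: '5', number2: '6' })); // 5+6
console.log(cacheKey({ number1: '6', number2: '5' })); // 5+6 -- same cache entry
```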
Tip
When modifying or debugging a step function, you can use the AWS Console to make
small changes until it works, instead of deploying the stack for each small change.
Step functions are a fantastic entry into the possibilities of building on AWS, and
there’s much more that can be done with them than what we covered in this lab. Here
are some resources if you want to dive deeper:
A fantastic talk from Chris Munns, Principal Developer Advocate for Serverless, on
adding business logic to step functions: https://www.youtube.com/watch?v=c797gM0f_Pc
Next
Day 2: Secure your app
You will have enhanced your app from Day 1 to be more secure, using best practices
of Amazon InfoSec.
Introduction
Here at Amazon, our job isn’t just adding pairs of numbers together and emailing
ourselves the sum. When that pair of numbers is data that our customers have
trusted us with, we have to think about how we process, store, and access that data
securely in order to maintain our customers' trust.
Our culture of Security as Job Zero has driven us to invent mechanisms for
building, monitoring, and continuously re-evaluating our systems’ security.
Mote is an internal library of secure-by-default AWS CDK constructs that you can
add as a dependency to your CDK resources and sleep peacefully at night. Mote CDK
constructs extend standard CDK constructs and can be used interchangeably. It's our
recommendation that you always prefer Mote constructs, wherever available, over
standard CDK constructs.
Note
Mote is developed for CDO (i.e., non-AWS) use cases, since it complies with that
organization’s data handling standards. In the future, Mote may be extended by AWS
security to include the data handling standards of AWS, which differ slightly. If
you’re in AWS, just be aware that we’re using Mote here as an example of how to
think about and use available tooling to create secure-by-default applications.
Refer to the Mote Workshop to learn more about Mote. Mote constructs are now
distributed via AWS CodeArtifact (so there's no need to add an extra Brazil package
dependency!); you can also read about this on Goshawk.
Note
The versions of aws-cdk-lib and @amzn/motecdk must be compatible with each other.
Use the exact versions of both packages shown below, or else you may get
compilation errors.
    // DynamoDB Setup
    this.table = new SecureTable(this, 'Table', {
      partitionKey: { name: 'id', type: AttributeType.STRING },
      encryption: Exempt(TableEncryption.AWS_MANAGED),
      pointInTimeRecovery: true,
      billingMode: BillingMode.PAY_PER_REQUEST,
      removalPolicy: RemovalPolicy.DESTROY,
    });

    // SNS Topic
    this.topic = new SecureTopic(this, 'Topic', {
      masterKey: Exempt('KeyLess'),
      displayName: 'VerificationTopic'
    });

    // Subscribe to Email
    const subscription = new SecureSubscription(this, 'EmailSubscription', {
      topic: this.topic,
      protocol: Exempt(SubscriptionProtocol.EMAIL),
      endpoint: '[email protected]' // change alias
    });
In serviceStack.ts, we’ll also be making a handful of changes to the resources to
secure them. Look at the commit for CdkWorkshopEricnCDK for all the relevant
changes.
brazil-build
If you encounter any issues while building, double-check that your motecdk and aws-
cdk-lib versions are compatible with each other.
Just swapping out the default CDK constructs to the Mote secure constructs in your
code will have yielded a large number of changes to your resources to configure
them using best practices for secure usage. Take a look at all the differences by
running:
We’ve enabled Server-Side Encryption (SSE) on the table, which protects our data-
at-rest while being stored in the database.
We've enabled Point In Time Recovery (PITR) on the table, which allows us to roll
back the table to any point in time, helping protect against data loss due to
corruption or mistaken deletions.
We’ve changed our table’s capacity from Provisioned to On-Demand, which helps
protect against availability issues due to unexpected scaling.
Since none of these options are the default for the Table construct, using Mote
helps guide us to discover and address these sometimes extremely critical but
easily-overlooked settings for production systems.
git add .
git commit -m "Secure stacks using mote"
git push
Once you merge your changes, have a look at your pipeline and track the progress
until it successfully deploys to alpha. Now you’ve got a secure application running
on AWS!
Next
Day 3: Create a Development Stack
What will you learn? What a personal bootstrap stack is, how to create one, and how
to use it.
Introduction
Remember this statement we made earlier about how to deploy your code?
Pushing to mainline without some prior validation that your changes work is usually
a recipe for failures, rollbacks, and reverts. Instead, we’ll deploy this new stack
first manually to verify it does what we want.
Your reaction to that may have been “okay.. but.. deploying manually is better?” 🤨
If you were a bit skeptical about that, you’re absolutely correct. We took the step
earlier to update the application manually from our code changes without receiving
CR approval or merging the changes to mainline because this is a toy application in
our personal account which nobody else is depending on. But as you may have
guessed, this is not best practice for applications in production or which have
customers which may be impacted by a bad code change deployed manually. In this
case, getting a pipeline failure or even a rollback is really a much better
outcome for our customers than impacting them by deploying unreviewed code
manually.
Additionally, in this workshop, you’re the only one working on this code package.
When there are multiple contributors making changes, every engineer deploying
manually to a shared environment may result in chaos. We need a better, safer way
to test our changes out without touching the live application. A Personal Bootstrap
Stack can help us achieve this, and is what we’ll be diving into in this lab.
The big idea here is to give you a way of testing your infrastructure end-to-end
before you create a CR and have it merged and deployed via pipelines, and leaving
the pipeline — and the pipeline only — in charge of updating the production stacks.
Essentially, we’ll be cloning our existing application into a new completely
separate stage which does not conflict with or overlap with the alpha version
already deployed into our account.
We’ll use this functionality to create a new CDK application which has personal
development capabilities.
To start, let’s first make a copy of our existing app.ts to a new file by running
the following command from within our CdkWorkshopUsernameCDK package:
cp lib/app.ts lib/devApp.ts
Note
If you get an error like cp: cannot stat 'lib/app.ts': No such file or directory,
make sure you're in the top-level folder of CdkWorkshopUsernameCDK (rather than in
lib/). Running pwd can show you where you currently are if you're not sure.
Now, we’ll make a few modifications to our devApp.ts to replace the pipeline
deployment environment with a personal BootstrapStack deployment environment. Refer
to CdkWorkshopEricnCDK for all the changes we’ll make to this file, and below is
what our modified devApp.ts file should look like. If you copy and paste from here,
don’t forget to replace the AWS account number, and also replace the
CdkWorkshopUsername references with your user name (but leave the other ${user}
references, as they are meant to be replaced at build time).
#!/usr/bin/env node
import { App } from 'aws-cdk-lib';
import { BootstrapStack, SoftwareType } from '@amzn/pipelines';
import { ServiceStack } from './serviceStack';
import { MonitoringStack } from './monitoringStack';
import { InfraStack } from './infraStack';
import { StepFunctionStack } from './stepFunctionStack';
const deploymentProps = {
  env,
  softwareType: SoftwareType.INFRASTRUCTURE,
  stage: stageName,
  isProd: false,
  disambiguator: user,
};
  "scripts": {
    ...(existing scripts here)...
    "cdk:dev": "brazil-build cdk --require-approval never -a 'npx ts-node lib/devApp.ts'",
    "bootstrap:dev": "npm run cdk:dev deploy BONESBootstrap-*",
    "deploy:dev": "npm run cdk:dev deploy *-${USER}"
  },
Note
When we use a build system (such as NPM, which we're using here, or Gradle, which
your Java Lambda is using) in the Brazil ecosystem, running brazil-build (either
alone or with additional arguments) actually invokes a build target in the
underlying build system, which you can define and customize as we are doing here.
To run a custom script in the case of the NPM build system, we need to run it with
brazil-build app script-name.
For more information about build targets in Brazil, see Build Targets.
As you can see, the cdk:dev build target overrides the application to be used from
the regular lib/app.ts to our new lib/devApp.ts. This means that we should be able
to run the following command to see our new application’s output:
error TS2322: Type 'string | undefined' is not assignable to type 'string'.
The offending line is:
const user: string = process.env.USER;
Try running: kinit && mwinit
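If you hit the TS2322 error above, it's because process.env values are typed string | undefined in TypeScript. A minimal sketch of one fix, assuming a fallback value is acceptable in your devApp.ts:

```typescript
// process.env.USER has type `string | undefined`, so assigning it directly to
// a `string` fails to compile. Narrow it with a fallback (?? picks the
// right-hand side only when the left side is null or undefined).
const user: string = process.env.USER ?? 'unknown-user';

console.log(`Building dev stacks for ${user}`);
```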
We should get:
BONESBootstrap-username-11122223334-us-west-2
CdkWorkshopUsername-Infra-dev-username
CdkWorkshopUsername-Service-dev-username
CdkWorkshopUsername-Monitoring-dev-username
CdkWorkshopUsername-StepFunction-dev-username
Sweet! It looks like we've successfully changed our application code to create new
stacks under the dev stage that are exclusive to ourselves, as indicated by the
-dev-username suffix.
If you get stuck or have issues building, see the CdkWorkshopEricnCDK example to
see all the changes you should have made in this section.
So why would that be the case? Wasn’t the whole point of this lab to create a
completely separate stack for ourselves which didn’t overlap at all with the
pipeline’s resources?
As it turns out, some AWS resources enforce uniqueness of resource names and do not
allow duplicates to exist in the same account at the same time. Usually this is the
case when the name of the resource is one of its key identifying characteristics,
rather than say, a randomly-generated ID. Some examples of this are:
AWS IAM - IAM User and Role resources (CDK: User / Role, Cfn: AWS::IAM::User /
AWS::IAM::Role) are unique per account. Two different AWS accounts can each have a
User or Role with the same name, but a single account can only ever have one
resource with a given name, across all regions the account uses.
Since Lambda Functions are one of the resources whose name must be unique within an
account and region, our current strategy of hardcoding the function name in the
CDK code won’t work when we want to create a copy of this function in a different
stack. For other resources where we didn’t hardcode a name (either because the
resource doesn’t support a name at all, or if it does we just let CDK fill in the
name with a randomly-generated value), we don't need to do anything here.
Here is the commit for all files we are changing in this section.
We pass our username as the unique string (disambiguator) to all of our service
stacks’ deployment properties from our dev stacks (devApp.ts).
We are making our Lambda name unique: we add the readonly disambiguator?: string;
parameter to the ServiceStackProps interface, then use this parameter in the name
of our Lambda (serviceStack.ts):
Note
Be careful to replace the single quotes (') with backticks (`) for any names
using the disambiguator variable in them. JavaScript does not perform string
interpolation (see https://stackoverflow.com/a/44886742) unless the string is
quoted with backticks, so you'll end up literally defining a function name called
CdkWorkshopAlias${props.disambiguator... if you don't use them :/
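To see the difference concretely (plain TypeScript; props here is a stand-in object, not the real ServiceStackProps):

```typescript
const props = { disambiguator: 'username' };

// Single quotes: no interpolation -- the ${...} text is kept literally.
const wrong = 'CdkWorkshopAlias${props.disambiguator}';

// Backticks: a template literal -- the expression is evaluated and spliced in.
const right = `CdkWorkshopAlias${props.disambiguator}`;

console.log(wrong); // CdkWorkshopAlias${props.disambiguator}
console.log(right); // CdkWorkshopAliasusername
```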
Step 4: Re-deploy
Once you’ve made all the disambiguator changes to generate unique resource names in
the dev stack, build your changes and make sure they succeed:
Once everything's working, let's get back to what we were doing before and deploy
this new stack:
Well, yeah, we have fixed the issues with resource naming collisions, but
unfortunately we’ve hit another sharp edge of deploying CloudFormation. When you
try to deploy a CloudFormation stack for the first time (as we tried to deploy
our personal service stack earlier) and something goes wrong, CloudFormation
rolls back the changes by undoing everything it tried to do, at which point the
stack goes into a ROLLBACK_COMPLETE state. The only thing remaining at that
point is the empty stack itself, which sticks around so that you can inspect the
event log to see what happened and fix it (which is what we just did).
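If you prefer the CLI to the console, you can confirm the stack’s state with a standard AWS CLI call; the stack name below is a placeholder to replace with your own:

```shell
# Expect "ROLLBACK_COMPLETE" for the failed first deployment
aws cloudformation describe-stacks \
  --stack-name MY-DEV-STACK-NAME \
  --query "Stacks[0].StackStatus" \
  --output text
```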
However, once you’ve figured out and fixed the problem, it’s not possible to
update this same CloudFormation stack again; you instead need to delete it and
re-create it from scratch. This is what CDK is trying to do for you, but it is
blocked by Termination Protection, a CloudFormation safety feature that is
enabled on your stacks by default. You can disable termination protection by
running the following commands:
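The exact commands depend on your environment; with the standard AWS CLI, disabling termination protection and then deleting the stuck stack (so the next deploy can re-create it) might look like this, where the stack name is a placeholder to replace with your own:

```shell
# Turn off termination protection for the rolled-back stack
aws cloudformation update-termination-protection \
  --stack-name MY-DEV-STACK-NAME \
  --no-enable-termination-protection

# Delete the empty ROLLBACK_COMPLETE stack so the next deploy can re-create it
aws cloudformation delete-stack --stack-name MY-DEV-STACK-NAME
```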
Step 5: Verify
If all went well, then you can log in to your AWS console and verify that you
have 4 CloudFormation stacks with -dev-username in their names which exactly
mirror the -alpha stacks. These make up your personal development environment,
which you can
change, test with, tear down and re-create without any fear of impacting others.
git add .
git commit -am "Add personal development app"
git push
After your code has moved through the pipeline, everything should be green.
Step 8: Recap
Our goal in this section was to set up a duplicate service under our personal
namespace that doesn’t interfere with the pipeline and its alpha stacks. However,
we ran into some issues that ended up teaching us a bit about how CDK works under
the hood, some of its limitations, and how to resolve them. Let’s take a trip down
memory lane:
We learned how CloudFormation behaves when a new stack rolls back due to an issue.
This is the set of problems we encountered during this lab and the solutions for
them we decided to use. These are not the only ways of solving these problems. For
instance, instead of disambiguating resources, we could simply deploy our dev app
to a different AWS account than our main app by changing the account ID in that
file (in fact, this is usually best practice for a real application). Or, we could
have changed the AWS region to which the dev stack deploys, since most of the
resources we needed to change are unique by account and region. In the real world,
you will have to weigh these options and the downsides/benefits of each to
determine what’s right for your use case.
Next
Day 4: Test your app
Introduction
In this part you will learn about how to test CDK constructs as part of your build.
In Day 3, you learned about the Personal Bootstrap stack, which is also related
to testing. The difference is that with Personal Bootstrap stacks you test your
CDK application end to end, whereas in this section we focus on testing
constructs and stacks at build time. It might help to think of the distinction
as the difference between unit tests and integration/end-to-end tests. We will
demo fine-grained assertion tests using a popular testing framework called Jest.
If you want more context, see Testing constructs in the AWS Cloud Development
Kit Developer Guide.
To start out writing tests, let’s create these files within the /test directory
(where our existing .ts test file is):
touch test/infra.test.ts
touch test/pipeline.test.ts
Here’s an example test written for the resources generated in our infra stack:
import { Template } from 'aws-cdk-lib/assertions';
import { DeploymentStack } from '@amzn/pipelines';
import { app } from '../lib/app';

// Assertion tests
test('create expected Infra Resources', () => {
  const infraStack = app.node.findChild('CdkWorkshopUsername-Infra-alpha') as DeploymentStack;
  const template = Template.fromStack(infraStack);
  // The infra stack should contain a DynamoDB table using on-demand billing
  template.hasResourceProperties('AWS::DynamoDB::Table', {
    BillingMode: 'PAY_PER_REQUEST',
  });
});
We can also add specific assertions against the CDK pipeline. This will ensure
the stages are created exactly where needed, with the expected names, and so on.
import {
  DeploymentGroupCfnSubTarget,
  DeploymentGroupTarget,
  DeploymentPipeline,
  Stage as PipelineStage,
} from '@amzn/pipelines';
import { app, applicationAccount } from '../lib/app';

// Map each pipeline stage name to the stack ARNs we expect it to deploy
// (the stage name 'alpha' is assumed; adjust it to match your pipeline).
const stageAccountMap = new Map<string, string[]>();
stageAccountMap.set('alpha', [
  `arn:aws:cloudformation:us-west-2:${applicationAccount}:stack/CdkWorkshopUsername-Infra-alpha`,
  `arn:aws:cloudformation:us-west-2:${applicationAccount}:stack/CdkWorkshopUsername-Service-alpha`,
  `arn:aws:cloudformation:us-west-2:${applicationAccount}:stack/CdkWorkshopUsername-StepFunction-alpha`,
  `arn:aws:cloudformation:us-west-2:${applicationAccount}:stack/CdkWorkshopUsername-Monitoring-alpha`,
]);
stageAccountMap.set('Pipeline', [
  `arn:aws:cloudformation:us-east-1:${applicationAccount}:stack/Pipeline-CdkWorkshopUsername`,
]);

// Assert that every deployment target in a stage matches the expected ARNs.
const testStages = (pipelineStage: PipelineStage) => {
  pipelineStage?.targets.forEach((target) => {
    const dgGrpTarget = target as DeploymentGroupTarget;
    const actualPipelineStageArns = dgGrpTarget.subTargets.map((subTarget) => {
      const dgCfnSubTarget: DeploymentGroupCfnSubTarget = subTarget as DeploymentGroupCfnSubTarget;
      return dgCfnSubTarget.stack.toStackArn;
    });
    expect(actualPipelineStageArns.sort()).toEqual(stageAccountMap.get(pipelineStage.name)!.sort());
  });
};
// `pipelineStage` and `alphaStg` are the Stage objects retrieved from your
// DeploymentPipeline (retrieval elided in this excerpt).
testStages(pipelineStage!);
testStages(alphaStg!);
});
You will also need to export the app and applicationAccount variables from your
app.ts file so that the tests can import them:
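The change in lib/app.ts is just adding export to the declarations that already exist there (a fragment; keep your actual initializers as they are):

```typescript
// lib/app.ts (fragment)
export const applicationAccount = '123456789012'; // your existing account ID
export const app = new App();                     // your existing CDK App instance
```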
Step 2: Verify
Run the tests with brazil-build:
brazil-build
The build should succeed and you should see something like:
> jest
Congratulations!🤘
Not only have you written infrastructure-as-code using CDK, you have also tested
your infrastructure’s correctness even before it deploys. Writing and committing
these tests will ensure the quality and correctness of your code is maintained even
as changes are made to it over time.
Next
Conclusion
Conclusion
Cleanup
It is important to clean up the artifacts you created while working through the
workshop. This helps to keep the tools and workflow tidy. You will need to clean up
the:
Burner account
Pipeline
Packages
Click Start Cleanup in the right column, and you will be redirected to the
cleanup page.
On that page, follow the onscreen instructions for cleaning up the application
resources. Start by granting permissions and, once done, click the Start Cleanup
button. Wrap up by cleaning up the burner account.
What’s next
With the help of CDK, the power of Infrastructure as Code is at your fingertips
and the possibilities are endless! Here are some packages we recommend checking
out if you want to learn more about using CDK internally at Amazon, along with
CDK pipelines you can study for more examples of how teams use these packages
to build applications that run in production.
MoteCDK – secure-by-default AWS CDK constructs. Please use them whenever possible.