aws_saa_c03_part_6

Question: 1009 CertyIQ

A company runs an environment where data is stored in an Amazon S3 bucket. The objects are accessed
frequently throughout the day. The company has strict data encryption requirements for data that is stored in the
S3 bucket. The company currently uses AWS Key Management Service (AWS KMS) for encryption.

The company wants to optimize costs associated with encrypting S3 objects without making additional calls to
AWS KMS.

Which solution will meet these requirements?

A.Use server-side encryption with Amazon S3 managed keys (SSE-S3).


B.Use an S3 Bucket Key for server-side encryption with AWS KMS keys (SSE-KMS) on the new objects.
C.Use client-side encryption with AWS KMS customer managed keys.
D.Use server-side encryption with customer-provided keys (SSE-C) stored in AWS KMS.

Answer: B

Explanation:

An S3 Bucket Key lets Amazon S3 use a bucket-level data key derived from the KMS key to encrypt objects, which sharply reduces the number of requests made to AWS KMS and therefore lowers KMS request costs while the data remains encrypted with SSE-KMS.

Reference:

https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-key.html
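
A minimal sketch of how the setting might be applied, assuming a placeholder bucket name and KMS key ARN (neither appears in the question):

import boto3

s3 = boto3.client("s3")

# Set SSE-KMS with an S3 Bucket Key as the bucket's default encryption.
# "example-bucket" and the key ARN are placeholders.
s3.put_bucket_encryption(
    Bucket="example-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
                },
                # BucketKeyEnabled tells S3 to use a bucket-level data key,
                # which reduces the number of calls made to AWS KMS.
                "BucketKeyEnabled": True,
            }
        ]
    },
)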

Question: 1010 CertyIQ


A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is
expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The
company wants to migrate the workloads to AWS.

The migration is time sensitive. The company wants to use a lift-and-shift strategy for non-critical workloads.

Which combination of steps will meet these requirements? (Choose three.)

A.Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.
B.Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.
C.Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.
D.Stop all operations on the VMs. Launch a cutover instance.
E.Use AWS App2Container (A2C) to collect data about the VMs.
F.Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Answer: BCD

Explanation:

B.Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.

C.Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.

D.Stop all operations on the VMs. Launch a cutover instance.
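
As a rough sketch of how steps C and D map to the AWS Application Migration Service API after the Replication Agent (step B) has been installed on the VMs; the source server IDs below are placeholders and the calls assume replication has already completed:

import boto3

mgn = boto3.client("mgn")

# Placeholder IDs of source servers that are already replicating.
source_servers = ["s-1234567890abcdef0"]

# Launch test instances for acceptance testing (step C).
mgn.start_test(sourceServerIDs=source_servers)

# After testing, launch cutover instances (step D).
mgn.start_cutover(sourceServerIDs=source_servers)
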

Question: 1011 CertyIQ


A company hosts an application in a private subnet. The company has already integrated the application with
Amazon Cognito. The company uses an Amazon Cognito user pool to authenticate users.

The company needs to modify the application so the application can securely store user documents in an Amazon
S3 bucket.

Which combination of steps will securely integrate Amazon S3 with the application? (Choose two.)

A.Create an Amazon Cognito identity pool to generate secure Amazon S3 access tokens for users when they
successfully log in.
B.Use the existing Amazon Cognito user pool to generate Amazon S3 access tokens for users when they
successfully log in.
C.Create an Amazon S3 VPC endpoint in the same VPC where the company hosts the application.
D.Create a NAT gateway in the VPC where the company hosts the application. Assign a policy to the S3 bucket
to deny any request that is not initiated from Amazon Cognito.
E.Attach a policy to the S3 bucket that allows access only from the users' IP addresses.

Answer: AC

Explanation:

A.Create an Amazon Cognito identity pool to generate secure Amazon S3 access tokens for users when they
successfully log in.

C.Create an Amazon S3 VPC endpoint in the same VPC where the company hosts the application.
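
A hedged sketch of the application-side flow: the user pool ID token is exchanged through an identity pool for temporary AWS credentials, which are then used against Amazon S3. The region, pool IDs, bucket name, and token are placeholders, not values from the question.

import boto3

REGION = "us-east-1"  # placeholder
IDENTITY_POOL_ID = "us-east-1:11111111-2222-3333-4444-555555555555"  # placeholder
USER_POOL_ID = "us-east-1_examplepool"  # placeholder
id_token = "eyJ..."  # ID token returned by the Cognito user pool after login

cognito_identity = boto3.client("cognito-identity", region_name=REGION)
logins = {f"cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}": id_token}

# Map the authenticated user-pool identity to an identity-pool identity.
identity = cognito_identity.get_id(IdentityPoolId=IDENTITY_POOL_ID, Logins=logins)

# Exchange the identity for temporary AWS credentials scoped by the
# identity pool's authenticated IAM role.
creds = cognito_identity.get_credentials_for_identity(
    IdentityId=identity["IdentityId"], Logins=logins
)["Credentials"]

# Use the temporary credentials to call Amazon S3; with an S3 VPC endpoint in
# the application's VPC, this traffic stays on the AWS network.
s3 = boto3.client(
    "s3",
    region_name=REGION,
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretKey"],
    aws_session_token=creds["SessionToken"],
)
s3.put_object(Bucket="example-user-docs-bucket", Key="user/document.pdf", Body=b"...")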

Question: 1012 CertyIQ


A company has a three-tier web application that processes orders from customers. The web tier consists of
Amazon EC2 instances behind an Application Load Balancer. The processing tier consists of EC2 instances. The
company decoupled the web tier and processing tier by using Amazon Simple Queue Service (Amazon SQS). The
storage layer uses Amazon DynamoDB.

At peak times, some users report order processing delays and halts. The company has noticed that during these
delays, the EC2 instances are running at 100% CPU usage, and the SQS queue fills up. The peak times are variable
and unpredictable.

The company needs to improve the performance of the application.

Which solution will meet these requirements?

A.Use scheduled scaling for Amazon EC2 Auto Scaling to scale out the processing tier instances for the
duration of peak usage times. Use the CPU Utilization metric to determine when to scale.
B.Use Amazon ElastiCache for Redis in front of the DynamoDB backend tier. Use target utilization as a metric to
determine when to scale.
C.Add an Amazon CloudFront distribution to cache the responses for the web tier. Use HTTP latency as a metric
to determine when to scale.
D.Use an Amazon EC2 Auto Scaling target tracking policy to scale out the processing tier instances. Use the
ApproximateNumberOfMessages attribute to determine when to scale.

Answer: D

Explanation:

Use an Amazon EC2 Auto Scaling target tracking policy to scale out the processing tier instances. Use the
ApproximateNumberOfMessages attribute to determine when to scale.
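
Scaling on queue depth tracks actual demand, which suits unpredictable peaks better than a fixed schedule. Below is a hedged sketch of such a target tracking policy; the Auto Scaling group name, queue name, and target value are placeholders, and CloudWatch publishes the queue depth as ApproximateNumberOfMessagesVisible. AWS also documents a "backlog per instance" variant that divides queue depth by the number of in-service instances.

import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="processing-tier-asg",   # placeholder
    PolicyName="sqs-backlog-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "ApproximateNumberOfMessagesVisible",
            "Namespace": "AWS/SQS",
            "Dimensions": [{"Name": "QueueName", "Value": "orders-queue"}],  # placeholder
            "Statistic": "Average",
        },
        # Scale out when the average queue depth rises above this target.
        "TargetValue": 100.0,
    },
)
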
Question: 1013 CertyIQ
A company's production environment consists of Amazon EC2 On-Demand Instances that run constantly between
Monday and Saturday. The instances must run for only 12 hours on Sunday and cannot tolerate interruptions. The
company wants to cost-optimize the production environment.

Which solution will meet these requirements MOST cost-effectively?

A.Purchase Scheduled Reserved Instances for the EC2 instances that run for only 12 hours on Sunday.
Purchase Standard Reserved Instances for the EC2 instances that run constantly between Monday and
Saturday.
B.Purchase Convertible Reserved Instances for the EC2 instances that run for only 12 hours on Sunday.
Purchase Standard Reserved Instances for the EC2 instances that run constantly between Monday and
Saturday.
C.Use Spot Instances for the EC2 instances that run for only 12 hours on Sunday. Purchase Standard Reserved
Instances for the EC2 instances that run constantly between Monday and Saturday.
D.Use Spot Instances for the EC2 instances that run for only 12 hours on Sunday. Purchase Convertible
Reserved Instances for the EC2 instances that run constantly between Monday and Saturday.

Answer: A

Explanation:

Purchase Scheduled Reserved Instances for the EC2 instances that run for only 12 hours on Sunday. Purchase
Standard Reserved Instances for the EC2 instances that run constantly between Monday and Saturday.
Scheduled Reserved Instances cover the recurring 12-hour Sunday window at a discount without the interruption risk of Spot Instances, while Standard Reserved Instances give the deepest discount for the steady Monday-through-Saturday workload.

Question: 1014 CertyIQ


A digital image processing company wants to migrate its on-premises monolithic application to the AWS Cloud.
The company processes thousands of images and generates large files as part of the processing workflow.

The company needs a solution to manage the growing number of image processing jobs. The solution must also
reduce the manual tasks in the image processing workflow. The company does not want to manage the underlying
infrastructure of the solution.

Which solution will meet these requirements with the LEAST operational overhead?

A.Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 Spot Instances to process the images.
Configure Amazon Simple Queue Service (Amazon SQS) to orchestrate the workflow. Store the processed files
in Amazon Elastic File System (Amazon EFS).
B.Use AWS Batch jobs to process the images. Use AWS Step Functions to orchestrate the workflow. Store the
processed files in an Amazon S3 bucket.
C.Use AWS Lambda functions and Amazon EC2 Spot Instances to process the images. Store the processed files
in Amazon FSx.
D.Deploy a group of Amazon EC2 instances to process the images. Use AWS Step Functions to orchestrate the
workflow. Store the processed files in an Amazon Elastic Block Store (Amazon EBS) volume.

Answer: B

Explanation:

Use AWS Batch jobs to process the images. Use AWS Step Functions to orchestrate the workflow. Store the
processed files in an Amazon S3 bucket.
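AWS Batch and Step Functions are both managed, so no infrastructure has to be operated. A minimal sketch of the orchestration piece follows, assuming placeholder job queue, job definition, role ARN, and state machine names; the .sync integration makes Step Functions wait for the Batch job to finish before the workflow continues.

import json
import boto3

sfn = boto3.client("stepfunctions")

# Minimal state machine: submit an AWS Batch job and wait for completion.
definition = {
    "StartAt": "ProcessImage",
    "States": {
        "ProcessImage": {
            "Type": "Task",
            "Resource": "arn:aws:states:::batch:submitJob.sync",
            "Parameters": {
                "JobName": "image-processing-job",
                "JobQueue": "arn:aws:batch:us-east-1:111122223333:job-queue/image-queue",
                "JobDefinition": "arn:aws:batch:us-east-1:111122223333:job-definition/image-job:1",
            },
            "End": True,
        }
    },
}

sfn.create_state_machine(
    name="image-processing-workflow",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111122223333:role/StepFunctionsBatchRole",  # placeholder
)
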
Question: 1015 CertyIQ
A company's image-hosting website gives users around the world the ability to upload, view, and download images
from their mobile devices. The company currently hosts the static website in an Amazon S3 bucket.

Because of the website's growing popularity, the website's performance has decreased. Users have reported
latency issues when they upload and download images.

The company must improve the performance of the website.

Which solution will meet these requirements with the LEAST implementation effort?

A.Configure an Amazon CloudFront distribution for the S3 bucket to improve the download performance.
Enable S3 Transfer Acceleration to improve the upload performance.
B.Configure Amazon EC2 instances of the right sizes in multiple AWS Regions. Migrate the application to the
EC2 instances. Use an Application Load Balancer to distribute the website traffic equally among the EC2
instances. Configure AWS Global Accelerator to address global demand with low latency.
C.Configure an Amazon CloudFront distribution that uses the S3 bucket as an origin to improve the download
performance. Configure the application to use CloudFront to upload images to improve the upload
performance. Create S3 buckets in multiple AWS Regions. Configure replication rules for the buckets to
replicate users' data based on the users' location. Redirect downloads to the S3 bucket that is closest to each
user's location.
D.Configure AWS Global Accelerator for the S3 bucket to improve network performance. Create an endpoint
for the application to use Global Accelerator instead of the S3 bucket.

Answer: A

Explanation:

Configure an Amazon CloudFront distribution for the S3 bucket to improve the download performance. Enable
S3 Transfer Acceleration to improve the upload performance.
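
CloudFront caches downloads at edge locations, and Transfer Acceleration routes uploads into AWS at the nearest edge. A minimal sketch of enabling Transfer Acceleration on the existing bucket, assuming a placeholder bucket name (the CloudFront distribution itself can be added separately without touching the bucket):

import boto3

s3 = boto3.client("s3")

# Enable S3 Transfer Acceleration so uploads enter AWS at the nearest edge
# location. "example-image-bucket" is a placeholder.
s3.put_bucket_accelerate_configuration(
    Bucket="example-image-bucket",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Clients then upload through the accelerated endpoint:
#   example-image-bucket.s3-accelerate.amazonaws.com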

Question: 1016 CertyIQ


A company runs an application in a private subnet behind an Application Load Balancer (ALB) in a VPC. The VPC
has a NAT gateway and an internet gateway. The application calls the Amazon S3 API to store objects.

According to the company's security policy, traffic from the application must not travel across the internet.

Which solution will meet these requirements MOST cost-effectively?

A.Configure an S3 interface endpoint. Create a security group that allows outbound traffic to Amazon S3.
B.Configure an S3 gateway endpoint. Update the VPC route table to use the endpoint.
C.Configure an S3 bucket policy to allow traffic from the Elastic IP address that is assigned to the NAT
gateway.
D.Create a second NAT gateway in the same subnet where the legacy application is deployed. Update the VPC
route table to use the second NAT gateway.

Answer: B

Explanation:

Configure an S3 gateway endpoint. Update the VPC route table to use the endpoint.
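A gateway endpoint for S3 is free and adds a prefix-list route to the selected route tables, so the S3 API calls never traverse the NAT gateway or the internet. A minimal sketch, assuming placeholder VPC, region, and route table IDs:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a gateway endpoint for S3 and attach it to the private subnet's
# route table. The VPC ID and route table ID are placeholders.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
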
Question: 1017 CertyIQ
A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on
Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon
S3 as part of the application deployment.

The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods
for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM).

Which solution meets these requirements?

A.Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach
both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to
Amazon S3 or DynamoDB for the respective EKS Pods.
B.Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the
Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods
for the UI.
C.Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the
AmazonS3FullAccess policy to the data services account and the AmazonDynamoDBFullAccess policy to the UI
service account.
D.Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM
Role for Service Accounts (IRSA) to provide access to the EKS Pods for the UI to Amazon S3 and the EKS Pods
for the data services to DynamoDB.

Answer: D

Explanation:

https://docs.aws.amazon.com/eks/latest/userguide/service-accounts.html#service-accounts-iam
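
With IRSA, each Kubernetes service account assumes its own IAM role, so the UI Pods and data-services Pods get separate, least-privilege permissions instead of sharing the node's instance profile. A hedged sketch of the IAM side for the data-services role follows; the OIDC provider ID, account ID, namespace, and service account name are placeholders, and a scoped-down custom policy would normally replace the managed policy shown.

import json
import boto3

iam = boto3.client("iam")

# Placeholders for the cluster's OIDC provider and the account.
OIDC_PROVIDER = "oidc.eks.us-east-1.amazonaws.com/id/EXAMPLED539D4633E53DE1B71EXAMPLE"
ACCOUNT_ID = "111122223333"

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::{ACCOUNT_ID}:oidc-provider/{OIDC_PROVIDER}"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    # Only the data-services service account may assume this role.
                    f"{OIDC_PROVIDER}:sub": "system:serviceaccount:data:data-services-sa",
                    f"{OIDC_PROVIDER}:aud": "sts.amazonaws.com",
                }
            },
        }
    ],
}

iam.create_role(
    RoleName="eks-data-services-s3-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the S3 permissions this workload needs (a narrower custom policy is
# preferable in practice), then annotate the Kubernetes service account with:
#   eks.amazonaws.com/role-arn: <role ARN>
iam.attach_role_policy(
    RoleName="eks-data-services-s3-role",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3FullAccess",
)
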

Question: 1018 CertyIQ


A company needs to give a globally distributed development team secure access to the company's AWS resources
in a way that complies with security policies.

The company currently uses an on-premises Active Directory for internal authentication. The company uses AWS
Organizations to manage multiple AWS accounts that support multiple projects.

The company needs a solution to integrate with the existing infrastructure to provide centralized identity
management and access control.

Which solution will meet these requirements with the LEAST operational overhead?

A.Set up AWS Directory Service to create an AWS managed Microsoft Active Directory on AWS. Establish a
trust relationship with the on-premises Active Directory. Use IAM roles that are assigned to Active Directory
groups to access AWS resources within the company's AWS accounts.
B.Create an IAM user for each developer. Manually manage permissions for each IAM user based on each user's
involvement with each project. Enforce multi-factor authentication (MFA) as an additional layer of security.
C.Use AD Connector in AWS Directory Service to connect to the on-premises Active Directory. Integrate AD
Connector with AWS IAM Identity Center. Configure permission sets to give each AD group access to specific
AWS accounts and resources.
D.Use Amazon Cognito to deploy an identity federation solution. Integrate the identity federation solution with
the on-premises Active Directory. Use Amazon Cognito to provide access tokens for developers to access AWS
accounts and resources.

Answer: C

Explanation:
Use AD Connector in AWS Directory Service to connect to the on-premises Active Directory. Integrate AD
Connector with AWS IAM Identity Center. Configure permission sets to give each AD group access to
specific AWS accounts and resources.
AD Connector proxies authentication to the existing on-premises directory without replicating it to AWS, and IAM Identity Center permission sets provide centralized, group-based access across all accounts in AWS Organizations with minimal ongoing administration.

Question: 1019 CertyIQ


A company is developing an application in the AWS Cloud. The application's HTTP API contains critical information
that is published in Amazon API Gateway. The critical information must be accessible from only a limited set of
trusted IP addresses that belong to the company's internal network.

Which solution will meet these requirements?

A.Set up an API Gateway private integration to restrict access to a predefined set of IP addresses.
B.Create a resource policy for the API that denies access to any IP address that is not specifically allowed.
C.Directly deploy the API in a private subnet. Create a network ACL. Set up rules to allow the traffic from
specific IP addresses.
D.Modify the security group that is attached to API Gateway to allow inbound traffic from only the trusted IP
addresses.

Answer: B

Explanation:

Create a resource policy for the API that denies access to any IP address that is not specifically allowed.
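
A hedged sketch of the shape such a resource policy might take, with placeholder account, region, API ID, and trusted CIDR ranges; the deny statement rejects any caller whose source IP is outside the allowed ranges.

import json

# Placeholders for the trusted corporate CIDRs and the API's execute-api ARN.
TRUSTED_CIDRS = ["203.0.113.0/24", "198.51.100.0/24"]
API_ARN = "arn:aws:execute-api:us-east-1:111122223333:a1b2c3d4e5/*"

resource_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Allow invocation in general...
            "Effect": "Allow",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": API_ARN,
        },
        {
            # ...but deny any request whose source IP is not in the trusted ranges.
            "Effect": "Deny",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": API_ARN,
            "Condition": {"NotIpAddress": {"aws:SourceIp": TRUSTED_CIDRS}},
        },
    ],
}

print(json.dumps(resource_policy, indent=2))
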
Thank you

Thank you for your interest in this premium exam material. I hope you found it informative and helpful.

If you have any feedback or thoughts on these practice questions, I would love to hear them.
Your insights help improve the material and better serve other readers.

Best of Luck

You have worked hard to get to this point, and you are well prepared for the exam.
Keep your head up, stay positive, and go show that exam what you're made of!


Total: 1019 Questions


Link: https://certyiq.com/papers/amazon/aws-certified-solutions-architect-associate-saa-c03
