
Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (551 New Questions)

Exam Questions AWS-Solution-Architect-Associate


Amazon AWS Certified Solutions Architect - Associate

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com



NEW QUESTION 1
- (Exam Topic 1)
A company is designing an application. The application uses an AWS Lambda function to receive information through Amazon API Gateway and to store the
information in an Amazon Aurora PostgreSQL database.
During the proof-of-concept stage, the company has to increase the Lambda quotas significantly to handle the high volumes of data that the company needs to
load into the database. A solutions architect must recommend a new design to improve scalability and minimize the configuration effort.
Which solution will meet these requirements?

A. Refactor the Lambda function code to Apache Tomcat code that runs on Amazon EC2 instances. Connect to the database by using native Java Database Connectivity (JDBC) drivers.
B. Change the platform from Aurora to Amazon DynamoDB. Provision a DynamoDB Accelerator (DAX) cluster. Use the DAX client SDK to point the existing DynamoDB API calls at the DAX cluster.
C. Set up two Lambda functions. Configure one function to receive the information. Configure the other function to load the information into the database. Integrate the Lambda functions by using Amazon Simple Notification Service (Amazon SNS).
D. Set up two Lambda functions. Configure one function to receive the information. Configure the other function to load the information into the database. Integrate the Lambda functions by using an Amazon Simple Queue Service (Amazon SQS) queue.

Answer: D

Explanation:
An SQS queue between the two Lambda functions buffers bursts of incoming data, so database-write bottlenecks are avoided and the design scales with minimal configuration effort.
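The buffering behavior the answer relies on can be sketched without any AWS dependency: the first function enqueues immediately, and the loader drains the queue at its own pace. A minimal stand-in using Python's standard library (the function names and payloads are invented for illustration):

```python
from collections import deque

# Stand-in for the SQS queue: producers enqueue, the loader drains at
# its own pace, so a burst of uploads never overwhelms the database.
ingest_queue = deque()

def receive_information(payload):
    """First Lambda: accept the request and enqueue it immediately."""
    ingest_queue.append(payload)

def load_into_database(batch_size=10):
    """Second Lambda: drain up to batch_size messages per invocation."""
    loaded = []
    for _ in range(min(batch_size, len(ingest_queue))):
        loaded.append(ingest_queue.popleft())
    return loaded

# A burst of 25 incoming records is absorbed by the queue...
for i in range(25):
    receive_information({"record": i})
# ...and consumed in controlled batches.
first_batch = load_into_database()
```

With SQS the same shape applies, except the queue is durable and the loader's batch size and concurrency are Lambda event-source settings rather than code.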

NEW QUESTION 2
- (Exam Topic 1)
A company performs monthly maintenance on its AWS infrastructure. During these maintenance activities, the company needs to rotate the credentials for its Amazon RDS for MySQL databases across multiple AWS Regions.
Which solution will meet these requirements with the LEAST operational overhead?

A. Store the credentials as secrets in AWS Secrets Manager. Use multi-Region secret replication for the required Regions. Configure Secrets Manager to rotate the secrets on a schedule.
B. Store the credentials as secrets in AWS Systems Manager by creating a secure string parameter. Use multi-Region secret replication for the required Regions. Configure Systems Manager to rotate the secrets on a schedule.
C. Store the credentials in an Amazon S3 bucket that has server-side encryption (SSE) enabled. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke an AWS Lambda function to rotate the credentials.
D. Encrypt the credentials as secrets by using AWS Key Management Service (AWS KMS) multi-Region customer managed keys. Store the secrets in an Amazon DynamoDB global table. Use an AWS Lambda function to retrieve the secrets from DynamoDB. Use the RDS API to rotate the secrets.

Answer: A

Explanation:
https://aws.amazon.com/blogs/security/how-to-replicate-secrets-aws-secrets-manager-multiple-regions/
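Answer A maps to two Secrets Manager operations: `ReplicateSecretToRegions` and `RotateSecret`. A hedged sketch of the request parameters only (the secret name, Regions, Lambda ARN, and schedule are placeholders, not values from the question):

```python
# Parameters for secretsmanager.replicate_secret_to_regions: copy the
# secret into each Region where the databases live. Values are placeholders.
replicate_params = {
    "SecretId": "prod/mysql-credentials",
    "AddReplicaRegions": [
        {"Region": "us-west-2"},
        {"Region": "eu-west-1"},
    ],
}

# Parameters for secretsmanager.rotate_secret: a rotation Lambda plus a
# schedule, so monthly maintenance no longer involves manual credential work.
rotate_params = {
    "SecretId": "prod/mysql-credentials",
    "RotationLambdaARN": "arn:aws:lambda:us-east-1:111122223333:function:rotate-mysql",
    "RotationRules": {"ScheduleExpression": "rate(30 days)"},
}
```

Secrets Manager provides managed rotation Lambdas for RDS engines, which is why this option carries the least operational overhead compared with hand-rolled rotation.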

NEW QUESTION 3
- (Exam Topic 1)
An application allows users at a company's headquarters to access product data. The product data is stored in an Amazon RDS MySQL DB instance. The
operations team has isolated an application performance slowdown and wants to separate read traffic from write traffic. A solutions architect needs to optimize the
application's performance quickly.
What should the solutions architect recommend?

A. Change the existing database to a Multi-AZ deployment. Serve the read requests from the primary Availability Zone.
B. Change the existing database to a Multi-AZ deployment. Serve the read requests from the secondary Availability Zone.
C. Create read replicas for the database. Configure the read replicas with half of the compute and storage resources as the source database.
D. Create read replicas for the database. Configure the read replicas with the same compute and storage resources as the source database.

Answer: D

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_MySQL.Replication.ReadReplicas.html

NEW QUESTION 4
- (Exam Topic 1)
A company hosts its multi-tier applications on AWS. For compliance, governance, auditing, and security, the company must track configuration changes on its
AWS resources and record a history of API calls made to these resources.
What should a solutions architect do to meet these requirements?

A. Use AWS CloudTrail to track configuration changes and AWS Config to record API calls
B. Use AWS Config to track configuration changes and AWS CloudTrail to record API calls
C. Use AWS Config to track configuration changes and Amazon CloudWatch to record API calls
D. Use AWS CloudTrail to track configuration changes and Amazon CloudWatch to record API calls


Answer: B

NEW QUESTION 5
- (Exam Topic 1)
A company has an application that ingests incoming messages. These messages are then quickly consumed by dozens of other applications and microservices.
The number of messages varies drastically and sometimes spikes as high as 100,000 each second. The
company wants to decouple the solution and increase scalability. Which solution meets these requirements?

A. Persist the messages to Amazon Kinesis Data Analytics. All the applications will read and process the messages.
B. Deploy the application on Amazon EC2 instances in an Auto Scaling group, which scales the number of EC2 instances based on CPU metrics.
C. Write the messages to Amazon Kinesis Data Streams with a single shard. All applications will read from the stream and process the messages.
D. Publish the messages to an Amazon Simple Notification Service (Amazon SNS) topic with one or more Amazon Simple Queue Service (Amazon SQS) subscriptions. All applications then process the messages from the queues.

Answer: D

Explanation:
https://aws.amazon.com/sqs/features/
Publishing the messages to an SNS topic with SQS queue subscriptions fans each message out to a dedicated queue per consumer, decoupling the producers from the dozens of consuming applications. Each consumer drains its own queue at its own pace, so spikes of up to 100,000 messages each second are buffered rather than dropped, and producers and consumers scale independently.
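The fan-out pattern behind answer D can be sketched in plain Python: a topic delivers a copy of every message to each subscribed queue. The class and queue names below are invented for illustration; in AWS the subscriptions are SNS-to-SQS, not in-memory lists:

```python
class Topic:
    """Stand-in for an SNS topic fanning out to SQS subscriptions."""

    def __init__(self):
        self.subscriptions = []

    def subscribe(self, queue):
        self.subscriptions.append(queue)

    def publish(self, message):
        # SNS delivers a copy of every message to each subscribed queue,
        # so every consuming application gets its own stream to drain.
        for queue in self.subscriptions:
            queue.append(message)

topic = Topic()
billing_queue, audit_queue, search_queue = [], [], []
for q in (billing_queue, audit_queue, search_queue):
    topic.subscribe(q)

topic.publish({"order_id": 1})
topic.publish({"order_id": 2})
```

The key property is that adding a new consumer is just one more subscription; the producer and the other consumers are untouched.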

NEW QUESTION 6
- (Exam Topic 1)
A company recently migrated to AWS and wants to implement a solution to protect the traffic that flows in and out of the production VPC. The company had an
inspection server in its on-premises data center. The inspection server performed specific operations such as traffic flow inspection and traffic filtering. The
company wants to have the same functionalities in the AWS Cloud.
Which solution will meet these requirements?

A. Use Amazon GuardDuty for traffic inspection and traffic filtering in the production VPC
B. Use Traffic Mirroring to mirror traffic from the production VPC for traffic inspection and filtering.
C. Use AWS Network Firewall to create the required rules for traffic inspection and traffic filtering for the production VPC.
D. Use AWS Firewall Manager to create the required rules for traffic inspection and traffic filtering for the production VPC.

Answer: C

Explanation:
AWS Network Firewall supports both inspection and filtering as required

NEW QUESTION 7
- (Exam Topic 1)
A solutions architect is developing a multiple-subnet VPC architecture. The solution will consist of six subnets in two Availability Zones. The subnets are defined as
public, private and dedicated for databases. Only the Amazon EC2 instances running in the private subnets should be able to access a database.
Which solution meets these requirements?

A. Create a new route table that excludes the route to the public subnets' CIDR blocks. Associate the route table with the database subnets.
B. Create a security group that denies ingress from the security group used by instances in the public subnets. Attach the security group to an Amazon RDS DB instance.
C. Create a security group that allows ingress from the security group used by instances in the private subnets. Attach the security group to an Amazon RDS DB instance.
D. Create a new peering connection between the public subnets and the private subnets. Create a different peering connection between the private subnets and the database subnets.

Answer: C

Explanation:
Security groups are stateful. All inbound traffic is blocked by default; if you create an inbound rule allowing traffic in, that traffic is automatically allowed back out again. You cannot block specific IP addresses with security groups (use network ACLs for that instead).
"You can specify allow rules, but not deny rules." "When you first create a security group, it has no inbound rules. Therefore, no inbound traffic originating from
another host to your instance is allowed until you add inbound rules to the security group." Source:
https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#VPCSecurityGroups
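In boto3 terms, the rule from answer C references one security group from another. A hedged sketch of the request parameters for `ec2.authorize_security_group_ingress` (both group IDs and the port are placeholders; MySQL's 3306 is assumed since the database engine is not specified here):

```python
# Ingress rule for the database security group: allow database traffic
# only when the source is the security group attached to the
# private-subnet EC2 instances. Group IDs are placeholders.
ingress_params = {
    "GroupId": "sg-0a1b2c3d4e5f67890",  # attached to the RDS DB instance
    "IpPermissions": [{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [
            # Referencing a group instead of a CIDR means the rule follows
            # the instances, even as their private IPs change.
            {"GroupId": "sg-0f9e8d7c6b5a43210"}  # private-subnet instances
        ],
    }],
}
```

Because the public-subnet instances are in a different security group, they match no allow rule and are implicitly denied.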

NEW QUESTION 8
- (Exam Topic 1)
An Amazon EC2 administrator created the following policy associated with an IAM group containing several users

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (551 New Questions)

What is the effect of this policy?

A. Users can terminate an EC2 instance in any AWS Region except us-east-1.
B. Users can terminate an EC2 instance with the IP address 10.100.100.1 in the us-east-1 Region.
C. Users can terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.
D. Users cannot terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.

Answer: C

Explanation:
The policy denies all EC2 actions in every Region except us-east-1 and allows only principals whose source IP falls within 10.100.100.0/24 to terminate instances. A user calling from source IP 10.100.100.254 can therefore terminate instances in the us-east-1 Region.
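The policy document itself is not reproduced in this dump. Based on the explanation above, a policy with the described effect could be shaped like the following; the statement IDs and exact structure are assumptions, not the exam's original:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyEc2OutsideUsEast1",
      "Effect": "Deny",
      "Action": "ec2:*",
      "Resource": "*",
      "Condition": {"StringNotEquals": {"aws:RequestedRegion": "us-east-1"}}
    },
    {
      "Sid": "AllowTerminateFromOfficeRange",
      "Effect": "Allow",
      "Action": "ec2:TerminateInstances",
      "Resource": "*",
      "Condition": {"IpAddress": {"aws:SourceIp": "10.100.100.0/24"}}
    }
  ]
}
```

An explicit Deny always wins over an Allow, which is why the Region restriction holds regardless of the terminate permission.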

NEW QUESTION 9
- (Exam Topic 1)
A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is
streamed, and a storage solution for the data.
Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
B. Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3.
C. Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
D. Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.

Answer: C

NEW QUESTION 10
- (Exam Topic 1)
A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS A custom application in the company's data center
runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as
soon as possible.
The data center does not have any available network bandwidth for additional workloads A solutions architect must transfer the data and must configure the
transformation job to continue to run in the AWS Cloud
Which solution will meet these requirements with the LEAST operational overhead?

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (551 New Questions)

A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Answer: C

NEW QUESTION 10
- (Exam Topic 1)
An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The
company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS
queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the
Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?

A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read, before processing.

Answer: C
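The sizing rule in the visibility-timeout answer can be made concrete: the timeout must exceed the longest time a message can legitimately be in flight, i.e. the function timeout plus the batch window (the numbers below are illustrative, not from the question; AWS guidance often adds a safety factor on top):

```python
def min_visibility_timeout(function_timeout_s, batch_window_s, safety_factor=1):
    """Smallest visibility timeout (seconds) that keeps an in-flight SQS
    message hidden while Lambda may still be working on it. If the
    message reappears too early, a second invocation sends a second
    email, which is the bug described in the question."""
    return safety_factor * (function_timeout_s + batch_window_s)

# Example: 30 s function timeout plus a 20 s maximum batch window.
timeout = min_visibility_timeout(30, 20)
```

A common recommendation is a safety factor of 6 to cover retries, e.g. `min_visibility_timeout(30, 20, safety_factor=6)`.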

NEW QUESTION 14
- (Exam Topic 1)
A company has an Amazon S3 bucket that contains critical data. The company must protect the data from accidental deletion.
Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

A. Enable versioning on the S3 bucket.
B. Enable MFA Delete on the S3 bucket.
C. Create a bucket policy on the S3 bucket.
D. Enable default encryption on the S3 bucket.
E. Create a lifecycle policy for the objects in the S3 bucket.

Answer: AB
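Both protections from answers A and B are enabled through a single `s3.put_bucket_versioning` call. A hedged sketch of the request parameters (the bucket name, MFA device ARN, and token are placeholders):

```python
# Parameters for s3.put_bucket_versioning enabling both protections.
versioning_params = {
    "Bucket": "critical-data-bucket",
    "VersioningConfiguration": {
        "Status": "Enabled",     # answer A: keep prior versions of objects
        "MFADelete": "Enabled",  # answer B: deletions require an MFA code
    },
    # The MFA parameter is "<device-serial-arn> <current-token>". Only the
    # root user can enable MFA Delete, and only via the API or CLI.
    "MFA": "arn:aws:iam::111122223333:mfa/root-account-mfa-device 123456",
}
```

With versioning on, a delete only adds a delete marker; with MFA Delete on, permanently removing a version additionally requires a valid MFA token.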

NEW QUESTION 18
- (Exam Topic 1)
A company hosts a data lake on AWS. The data lake consists of data in Amazon S3 and Amazon RDS for PostgreSQL. The company needs a reporting solution
that provides data visualization and includes all the data sources within the data lake. Only the company's management team should have full access to all the
visualizations. The rest of the company should have only limited access.
Which solution will meet these requirements?

A. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate IAM roles.
B. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate users and groups.
C. Create an AWS Glue table and crawler for the data in Amazon S3. Create an AWS Glue extract, transform, and load (ETL) job to produce reports. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.
D. Create an AWS Glue table and crawler for the data in Amazon S3. Use Amazon Athena Federated Query to access data within Amazon RDS for PostgreSQL. Generate reports by using Amazon Athena. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.

Answer: B

Explanation:
QuickSight dashboards are shared with QuickSight users and groups, not with IAM roles, and QuickSight natively connects to both Amazon S3 and Amazon RDS for PostgreSQL.

NEW QUESTION 19
- (Exam Topic 1)
A solutions architect must design a highly available infrastructure for a website. The website is powered by Windows web servers that run on Amazon EC2
instances. The solutions architect must implement a solution that can mitigate a large-scale DDoS attack that originates from thousands of IP addresses.
Downtime is not acceptable for the website.
Which actions should the solutions architect take to protect the website from such an attack? (Select TWO.)

A. Use AWS Shield Advanced to stop the DDoS attack.
B. Configure Amazon GuardDuty to automatically block the attackers.
C. Configure the website to use Amazon CloudFront for both static and dynamic content.
D. Use an AWS Lambda function to automatically add attacker IP addresses to VPC network ACLs.
E. Use EC2 Spot Instances in an Auto Scaling group with a target tracking scaling policy that is set to 80% CPU utilization

Answer: AC


Explanation:
https://aws.amazon.com/cloudfront/

NEW QUESTION 21
- (Exam Topic 1)
A company has more than 5 TB of file data on Windows file servers that run on premises. Users and applications interact with the data each day.
The company is moving its Windows workloads to AWS. As the company continues this process, the company requires access to AWS and on-premises file storage with minimum latency. The company needs a solution that minimizes operational overhead and requires no significant changes to the existing file access patterns. The company uses an AWS Site-to-Site VPN connection for connectivity to AWS.
What should a solutions architect do to meet these requirements?

A. Deploy and configure Amazon FSx for Windows File Server on AWS. Move the on-premises file data to FSx for Windows File Server. Reconfigure the workloads to use FSx for Windows File Server on AWS.
B. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to the S3 File Gateway. Reconfigure the on-premises workloads and the cloud workloads to use the S3 File Gateway.
C. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to Amazon S3. Reconfigure the workloads to use either Amazon S3 directly or the S3 File Gateway, depending on each workload's location.
D. Deploy and configure Amazon FSx for Windows File Server on AWS. Deploy and configure an Amazon FSx File Gateway on premises. Move the on-premises file data to the FSx File Gateway. Configure the cloud workloads to use FSx for Windows File Server on AWS. Configure the on-premises workloads to use the FSx File Gateway.

Answer: D

NEW QUESTION 24
- (Exam Topic 1)
A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours. The company wants to use these data points in its existing analytics platform. A solutions architect must determine the most viable multi-tier option to support this architecture. The data points must be accessible from the REST API.
Which action meets these requirements for storing and retrieving location data?

A. Use Amazon Athena with Amazon S3
B. Use Amazon API Gateway with AWS Lambda
C. Use Amazon QuickSight with Amazon Redshift.
D. Use Amazon API Gateway with Amazon Kinesis Data Analytics

Answer: D

Explanation:
https://aws.amazon.com/solutions/implementations/aws-streaming-data-solution-for-amazon-kinesis/

NEW QUESTION 29
- (Exam Topic 1)
A company uses Amazon S3 to store its confidential audit documents. The S3 bucket uses bucket policies to restrict access to audit team IAM user credentials
according to the principle of least privilege. Company managers are worried about accidental deletion of documents in the S3 bucket and want a more secure
solution.
What should a solutions architect do to secure the audit documents?

A. Enable the versioning and MFA Delete features on the S3 bucket.
B. Enable multi-factor authentication (MFA) on the IAM user credentials for each audit team IAM user account.
C. Add an S3 Lifecycle policy to the audit team's IAM user accounts to deny the s3:DeleteObject action during audit dates.
D. Use AWS Key Management Service (AWS KMS) to encrypt the S3 bucket and restrict audit team IAM user accounts from accessing the KMS key.

Answer: A

NEW QUESTION 31
- (Exam Topic 1)
A company uses AWS Organizations to manage multiple AWS accounts for different departments. The management account has an Amazon S3 bucket that
contains project reports. The company wants to limit access to this S3 bucket to only users of accounts within the organization in AWS Organizations.
Which solution meets these requirements with the LEAST amount of operational overhead?

A. Add the aws:PrincipalOrgID global condition key with a reference to the organization ID to the S3 bucket policy.
B. Create an organizational unit (OU) for each department. Add the aws:PrincipalOrgPaths global condition key to the S3 bucket policy.
C. Use AWS CloudTrail to monitor the CreateAccount, InviteAccountToOrganization, LeaveOrganization, and RemoveAccountFromOrganization events. Update the S3 bucket policy accordingly.
D. Tag each user that needs access to the S3 bucket. Add the aws:PrincipalTag global condition key to the S3 bucket policy.

Answer: A

Explanation:
https://aws.amazon.com/blogs/security/control-access-to-aws-resources-by-using-the-aws-organization-of-iam-p The aws:PrincipalOrgID global key provides an
alternative to listing all the account IDs for all AWS accounts in an organization. For example, the following Amazon S3 bucket policy allows members of any
account in the XXX organization to add an object into the examtopics bucket.
{
  "Version": "2012-10-17",
  "Statement": {
    "Sid": "AllowPutObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::examtopics/*",
    "Condition": {"StringEquals": {"aws:PrincipalOrgID": ["XXX"]}}
  }
}
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html

NEW QUESTION 33
- (Exam Topic 1)
A company has a three-tier web application that is deployed on AWS. The web servers are deployed in a public subnet in a VPC. The application servers and
database servers are deployed in private subnets in the same VPC. The company has deployed a third-party virtual firewall appliance from AWS Marketplace in an
inspection VPC. The appliance is configured with an IP interface that can accept IP packets.
A solutions architect needs to integrate the web application with the appliance to inspect all traffic to the application before the traffic reaches the web server.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create a Network Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
B. Create an Application Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
C. Deploy a transit gateway in the inspection VPC. Configure route tables to route the incoming packets through the transit gateway.
D. Deploy a Gateway Load Balancer in the inspection VPC. Create a Gateway Load Balancer endpoint to receive the incoming packets and forward the packets to the appliance.

Answer: D

Explanation:
https://aws.amazon.com/blogs/networking-and-content-delivery/scaling-network-traffic-inspection-using-aws-ga

NEW QUESTION 38
- (Exam Topic 1)
A company is deploying a new public web application to AWS. The application will run behind an Application Load Balancer (ALB). The application needs to be
encrypted at the edge with an SSL/TLS certificate that is issued by an external certificate authority (CA). The certificate must be rotated each year before the
certificate expires.
What should a solutions architect do to meet these requirements?

A. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
B. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Import the key material from the certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
C. Use AWS Certificate Manager (ACM) Private Certificate Authority to issue an SSL/TLS certificate from the root CA. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
D. Use AWS Certificate Manager (ACM) to import an SSL/TLS certificate. Apply the certificate to the ALB. Use Amazon EventBridge (Amazon CloudWatch Events) to send a notification when the certificate is nearing expiration. Rotate the certificate manually.

Answer: D

NEW QUESTION 40
- (Exam Topic 1)
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of
terabytes The application data must be stored in a standard file system structure The company wants a solution that scales automatically, is highly available, and
requires minimum operational overhead.
Which solution will meet these requirements?

A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

Explanation:
Amazon EFS provides a standard file system interface, scales automatically, and is highly available.

NEW QUESTION 41
- (Exam Topic 1)
A company hosts more than 300 global websites and applications. The company requires a platform to analyze more than 30 TB of clickstream data each day.
What should a solutions architect do to transmit and process the clickstream data?

A. Design an AWS Data Pipeline to archive the data to an Amazon S3 bucket and run an Amazon EMR cluster with the data to generate analytics.
B. Create an Auto Scaling group of Amazon EC2 instances to process the data and send it to an Amazon S3 data lake for Amazon Redshift to use for analysis.
C. Cache the data to Amazon CloudFront. Store the data in an Amazon S3 bucket. When an object is added to the S3 bucket, run an AWS Lambda function to process the data for analysis.
D. Collect the data from Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to transmit the data to an Amazon S3 data lake. Load the data in Amazon Redshift for analysis.

Answer: D


Explanation:
https://aws.amazon.com/es/blogs/big-data/real-time-analytics-with-amazon-redshift-streaming-ingestion/

NEW QUESTION 42
- (Exam Topic 1)
A company runs an ecommerce application on Amazon EC2 instances behind an Application Load Balancer. The instances run in an Amazon EC2 Auto Scaling
group across multiple Availability Zones. The Auto Scaling group scales based on CPU utilization metrics. The ecommerce application stores the transaction data
in a MySQL 8.0 database that is hosted on a large EC2 instance.
The database's performance degrades quickly as application load increases. The application handles more read requests than write transactions. The company
wants a solution that will automatically scale the database to meet the demand of unpredictable read workloads while maintaining high availability.
Which solution will meet these requirements?

A. Use Amazon Redshift with a single node for leader and compute functionality.
B. Use Amazon RDS with a Single-AZ deployment. Configure Amazon RDS to add reader instances in a different Availability Zone.
C. Use Amazon Aurora with a Multi-AZ deployment. Configure Aurora Auto Scaling with Aurora Replicas.
D. Use Amazon ElastiCache for Memcached with EC2 Spot Instances.

Answer: C

Explanation:
Amazon Aurora delivers up to five times the throughput of MySQL on Amazon RDS. Aurora Auto Scaling adds Aurora Replicas to absorb the unpredictable read-heavy workload, and a Multi-AZ deployment maintains high availability.
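The "Aurora Auto Scaling with Aurora Replicas" part of answer C is configured through Application Auto Scaling: register the cluster's replica count as a scalable target, then attach a target-tracking policy. A hedged sketch of the two request parameter sets (cluster name, capacity limits, and target value are placeholders):

```python
# Parameters for application-autoscaling.register_scalable_target:
# make the Aurora cluster's replica count scalable. Values are placeholders.
register_params = {
    "ServiceNamespace": "rds",
    "ResourceId": "cluster:ecommerce-aurora-cluster",
    "ScalableDimension": "rds:cluster:ReadReplicaCount",
    "MinCapacity": 1,
    "MaxCapacity": 15,  # Aurora supports up to 15 replicas per cluster
}

# Parameters for application-autoscaling.put_scaling_policy: add or remove
# replicas to hold average reader CPU near the target.
policy_params = {
    "PolicyName": "aurora-replica-cpu-target",
    "ServiceNamespace": "rds",
    "ResourceId": "cluster:ecommerce-aurora-cluster",
    "ScalableDimension": "rds:cluster:ReadReplicaCount",
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingScalingPolicyConfiguration": {
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
        },
    },
}
```

The application then reads through the cluster's reader endpoint, which load-balances across however many replicas the policy currently maintains.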

NEW QUESTION 45
- (Exam Topic 2)
A company stores its application logs in an Amazon CloudWatch Logs log group. A new policy requires the company to store all application logs in Amazon
OpenSearch Service (Amazon Elasticsearch Service) in near-real time.
Which solution will meet this requirement with the LEAST operational overhead?

A. Configure a CloudWatch Logs subscription to stream the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).
B. Create an AWS Lambda function. Use the log group to invoke the function to write the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).
C. Create an Amazon Kinesis Data Firehose delivery stream. Configure the log group as the delivery stream's source. Configure Amazon OpenSearch Service (Amazon Elasticsearch Service) as the delivery stream's destination.
D. Install and configure Amazon Kinesis Agent on each application server to deliver the logs to Amazon Kinesis Data Streams. Configure Kinesis Data Streams to deliver the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

Answer: A

Explanation:
https://computingforgeeks.com/stream-logs-in-aws-from-cloudwatch-to-elasticsearch/

NEW QUESTION 48
- (Exam Topic 2)
A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in
Amazon S3. This content is the same for all users.
The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users
while reducing the load on the origin.
Which solution meets these requirements MOST cost-effectively?

A. Deploy an AWS Global Accelerator accelerator in front of the web servers.


B. Deploy an Amazon CloudFront web distribution in front of the S3 bucket.
C. Deploy an Amazon ElastiCache for Redis instance in front of the web servers.
D. Deploy an Amazon ElastiCache for Memcached instance in front of the web servers.

Answer: B

NEW QUESTION 52
- (Exam Topic 2)
A company needs to move data from an Amazon EC2 instance to an Amazon S3 bucket. The company must ensure that no API calls and no data are routed
through public internet routes. Only the EC2 instance can have access to upload data to the S3 bucket.
Which solution will meet these requirements?

A. Create an interface VPC endpoint for Amazon S3 in the subnet where the EC2 instance is located. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
B. Create a gateway VPC endpoint for Amazon S3 in the Availability Zone where the EC2 instance is located. Attach appropriate security groups to the endpoint. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
C. Run the nslookup tool from inside the EC2 instance to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
D. Use the AWS provided, publicly available ip-ranges.json file to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.

Answer: A

Explanation:

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (551 New Questions)

(https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/)

NEW QUESTION 54
- (Exam Topic 2)
An online retail company has more than 50 million active customers and receives more than 25,000 orders each day. The company collects purchase data for
customers and stores this data in Amazon S3. Additional customer data is stored in Amazon RDS.
The company wants to make all the data available to various teams so that the teams can perform analytics.
The solution must provide the ability to manage fine-grained permissions for the data and must minimize operational overhead.
Which solution will meet these requirements?

A. Migrate the purchase data to write directly to Amazon RDS. Use RDS access controls to limit access.
B. Schedule an AWS Lambda function to periodically copy data from Amazon RDS to Amazon S3. Create an AWS Glue crawler. Use Amazon Athena to query the data. Use S3 policies to limit access.
C. Create a data lake by using AWS Lake Formation. Create an AWS Glue JDBC connection to Amazon RDS. Register the S3 bucket in Lake Formation. Use Lake Formation access controls to limit access.
D. Create an Amazon Redshift cluster. Schedule an AWS Lambda function to periodically copy data from Amazon S3 and Amazon RDS to Amazon Redshift. Use Amazon Redshift access controls to limit access.

Answer: D

NEW QUESTION 55
- (Exam Topic 2)
A company wants to direct its users to a backup static error page if the company's primary website is unavailable. The primary website's DNS records are hosted in
Amazon Route 53. The domain is pointing to an Application Load Balancer (ALB). The company needs a solution that minimizes changes and infrastructure
overhead.
Which solution will meet these requirements?

A. Update the Route 53 records to use a latency routing policy. Add a static error page that is hosted in an Amazon S3 bucket to the records so that the traffic is sent to the most responsive endpoints.
B. Set up a Route 53 active-passive failover configuration. Direct traffic to a static error page that is hosted in an Amazon S3 bucket when Route 53 health checks determine that the ALB endpoint is unhealthy.
C. Set up a Route 53 active-active configuration with the ALB and an Amazon EC2 instance that hosts a static error page as endpoints. Configure Route 53 to send requests to the instance only if the health checks fail for the ALB.
D. Update the Route 53 records to use a multivalue answer routing policy. Create a health check. Direct traffic to the website if the health check passes. Direct traffic to a static error page that is hosted in Amazon S3 if the health check does not pass.

Answer: B

NEW QUESTION 60
- (Exam Topic 2)
A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that runs on an Amazon EC2 instance in a
single Availability Zone. These messages are processed by a different application that runs on a separate EC2 instance. This application stores the details in a
PostgreSQL database on another EC2 instance. All the EC2 instances are in the same Availability Zone.
The company needs to redesign its architecture to provide the highest availability with the least operational overhead.
What should a solutions architect do to meet these requirements?

A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
B. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
C. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
D. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.

Answer: B

NEW QUESTION 63
- (Exam Topic 2)
A company owns an asynchronous API that is used to ingest user requests and, based on the request type, dispatch requests to the appropriate microservice for
processing. The company is using Amazon API Gateway to deploy the API front end, and an AWS Lambda function that invokes Amazon DynamoDB to store user
requests before dispatching them to the processing microservices.
The company provisioned as much DynamoDB throughput as its budget allows, but the company is still experiencing availability issues and is losing user requests.
What should a solutions architect do to address this issue without impacting existing users?

A. Add throttling on the API Gateway with server-side throttling limits.


B. Use DynamoDB Accelerator (DAX) and Lambda to buffer writes to DynamoDB.
C. Create a secondary index in DynamoDB for the table with the user requests.


D. Use an Amazon Simple Queue Service (Amazon SQS) queue and Lambda to buffer writes to DynamoDB.

Answer: D

Explanation:
By using an SQS queue and Lambda, the solutions architect can decouple the API front end from the processing microservices and improve the overall scalability
and availability of the system. The SQS queue acts as a buffer, allowing the API front end to continue accepting user requests even if the processing microservices
are experiencing high workloads or are temporarily unavailable. The Lambda function can then retrieve requests from the SQS queue and write them to
DynamoDB, ensuring that all user requests are stored and processed. This approach allows the company to scale the processing microservices independently
from the API front end, ensuring that the API remains available to users even during periods of high demand.
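To make the buffering pattern concrete, here is a minimal single-process sketch in which Python's `queue.Queue` stands in for the SQS queue and a dict stands in for the DynamoDB table. The function names are illustrative only, not AWS APIs.

```python
import queue

def api_front_end(requests, buffer):
    """API Gateway + Lambda: accept every request by enqueueing it."""
    for req in requests:
        buffer.put(req)  # never rejected, even during bursts

def dynamodb_writer(buffer, table):
    """Consumer Lambda: drain the queue at a rate the database can absorb."""
    written = 0
    while not buffer.empty():
        item = buffer.get()
        table[item["id"]] = item  # stand-in for a DynamoDB PutItem call
        written += 1
    return written

buffer = queue.Queue()
table = {}
api_front_end([{"id": i, "type": "order"} for i in range(100)], buffer)
count = dynamodb_writer(buffer, table)
print(count, len(table))  # 100 100
```

The key property the sketch shows: ingestion (put) and persistence (get + write) are decoupled, so a slow writer never causes the front end to drop a request.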

NEW QUESTION 68
- (Exam Topic 2)
A company needs to save the results from a medical trial to an Amazon S3 repository. The repository must allow a few scientists to add new files and must restrict
all other users to read-only access. No users can have the ability to modify or delete any files in the repository. The company must keep every file in the repository
for a minimum of 1 year after its creation date.
Which solution will meet these requirements?

A. Use S3 Object Lock in governance mode with a legal hold of 1 year.
B. Use S3 Object Lock in compliance mode with a retention period of 365 days.
C. Use an IAM role to restrict all users from deleting or changing objects in the S3 bucket. Use an S3 bucket policy to only allow the IAM role.
D. Configure the S3 bucket to invoke an AWS Lambda function every time an object is added. Configure the function to track the hash of the saved object so that modified objects can be marked accordingly.

Answer: C

NEW QUESTION 72
- (Exam Topic 2)
A solutions architect needs to help a company optimize the cost of running an application on AWS. The application will use Amazon EC2 instances, AWS Fargate,
and AWS Lambda for compute within the architecture.
The EC2 instances will run the data ingestion layer of the application. EC2 usage will be sporadic and unpredictable. Workloads that run on EC2 instances can be
interrupted at any time. The application front end will run on Fargate, and Lambda will serve the API layer. The front-end utilization and API layer utilization will be
predictable over the course of the next year.
Which combination of purchasing options will provide the MOST cost-effective solution for hosting this application? (Choose two.)

A. Use Spot Instances for the data ingestion layer


B. Use On-Demand Instances for the data ingestion layer
C. Purchase a 1-year Compute Savings Plan for the front end and API layer.
D. Purchase 1-year All Upfront Reserved instances for the data ingestion layer.
E. Purchase a 1-year EC2 instance Savings Plan for the front end and API layer.

Answer: AC

NEW QUESTION 77
- (Exam Topic 2)
A company has a legacy data processing application that runs on Amazon EC2 instances. Data is processed sequentially, but the order of results does not matter.
The application uses a monolithic architecture. The only way that the company can scale the application to meet increased demand is to increase the size of the
instances.
The company's developers have decided to rewrite the application to use a microservices architecture on Amazon Elastic Container Service (Amazon ECS).
What should a solutions architect recommend for communication between the microservices?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Add code to the data producers, and send data to the queue. Add code to the data consumers to process data from the queue.
B. Create an Amazon Simple Notification Service (Amazon SNS) topic. Add code to the data producers, and publish notifications to the topic. Add code to the data consumers to subscribe to the topic.
C. Create an AWS Lambda function to pass messages. Add code to the data producers to call the Lambda function with a data object. Add code to the data consumers to receive a data object that is passed from the Lambda function.
D. Create an Amazon DynamoDB table. Enable DynamoDB Streams. Add code to the data producers to insert data into the table. Add code to the data consumers to use the DynamoDB Streams API to detect new table entries and retrieve the data.

Answer: A

Explanation:
An SQS queue decouples the producers from the consumers so each microservice can scale independently. The FIFO variant has limited throughput (300 messages/s without batching, 3,000 messages/s with batching of up to 10 messages per batch operation), prevents duplicates (exactly-once processing), preserves message order, and requires a queue name ending in .fifo. Because the order of results does not matter here, a standard queue also works.
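A quick back-of-the-envelope check of the FIFO throughput figures quoted above, using the limits named in the explanation:

```python
# Each API action on an SQS FIFO queue can carry up to 10 messages
# when batched, so batching multiplies the per-action limit.
FIFO_ACTIONS_PER_SECOND = 300
MAX_BATCH_SIZE = 10

unbatched = FIFO_ACTIONS_PER_SECOND * 1               # one message per action
batched = FIFO_ACTIONS_PER_SECOND * MAX_BATCH_SIZE    # ten messages per action
print(unbatched, batched)  # 300 3000
```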

NEW QUESTION 79
- (Exam Topic 2)
A company is building a containerized application on premises and decides to move the application to AWS. The application will have thousands of users soon after it is deployed. The company is unsure how to manage the deployment of containers at scale. The company needs to deploy the containerized application in a highly available architecture that minimizes operational overhead.
Which solution will meet these requirements?

A. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type to run the containers. Use target tracking to scale automatically based on demand.
B. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type to run the containers. Use target tracking to scale automatically based on demand.
C. Store container images in a repository that runs on an Amazon EC2 instance. Run the containers on EC2 instances that are spread across multiple Availability Zones. Monitor the average CPU utilization in Amazon CloudWatch. Launch new EC2 instances as needed.
D. Create an Amazon EC2 Amazon Machine Image (AMI) that contains the container image. Launch EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an Amazon CloudWatch alarm to scale out EC2 instances when the average CPU utilization threshold is breached.

Answer: A

NEW QUESTION 80
- (Exam Topic 2)
A company is developing a file-sharing application that will use an Amazon S3 bucket for storage. The company wants to serve all the files through an Amazon
CloudFront distribution. The company does not want the files to be accessible through direct navigation to the S3 URL.
What should a solutions architect do to meet these requirements?

A. Write individual policies for each S3 bucket to grant read permission for only CloudFront access.
B. Create an IAM user. Grant the user read permission to objects in the S3 bucket. Assign the user to CloudFront.
C. Write an S3 bucket policy that assigns the CloudFront distribution ID as the Principal and assigns the target S3 bucket as the Amazon Resource Name (ARN).
D. Create an origin access identity (OAI). Assign the OAI to the CloudFront distribution. Configure the S3 bucket permissions so that only the OAI has read permission.

Answer: D

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-access-to-amazon-s3/
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3
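For illustration, the bucket policy that option D produces can be sketched as below. The OAI ID `E2EXAMPLE` and the bucket name are placeholders, and the principal ARN format follows the linked AWS documentation.

```python
import json

def oai_read_policy(bucket_name, oai_id):
    """Build an S3 bucket policy that lets only the CloudFront OAI read objects."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowCloudFrontOAIReadOnly",
            "Effect": "Allow",
            "Principal": {
                # CloudFront OAIs are addressed with this special IAM ARN form
                "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {oai_id}"
            },
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
        }],
    }

policy = oai_read_policy("my-file-share-bucket", "E2EXAMPLE")
print(json.dumps(policy, indent=2))
```

Because the bucket grants read access only to the OAI principal, direct navigation to the S3 URL fails while requests through the distribution succeed.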

NEW QUESTION 82
- (Exam Topic 2)
A company is planning to build a high performance computing (HPC) workload as a service solution that is hosted on AWS. A group of 16 Amazon EC2 Linux instances requires the lowest possible latency for node-to-node communication. The instances also need a shared block device volume for high-performing storage.
Which solution will meet these requirements?

A. Use a cluster placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.
B. Use a cluster placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
C. Use a partition placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
D. Use a spread placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

Answer: A

NEW QUESTION 84
- (Exam Topic 2)
A company wants to migrate its on-premises data center to AWS. According to the company's compliance requirements, the company can use only the ap-
northeast-3 Region. Company administrators are not permitted to connect VPCs to the internet.
Which solutions will meet these requirements? (Choose two.)

A. Use AWS Control Tower to implement data residency guardrails to deny internet access and deny access to all AWS Regions except ap-northeast-3.
B. Use rules in AWS WAF to prevent internet access. Deny access to all AWS Regions except ap-northeast-3 in the AWS account settings.
C. Use AWS Organizations to configure service control policies (SCPs) that prevent VPCs from gaining internet access. Deny access to all AWS Regions except ap-northeast-3.
D. Create an outbound rule for the network ACL in each VPC to deny all traffic from 0.0.0.0/0. Create an IAM policy for each user to prevent the use of any AWS Region other than ap-northeast-3.
E. Use AWS Config to activate managed rules to detect and alert for internet gateways and to detect and alert for new resources deployed outside of ap-northeast-3.

Answer: AC

NEW QUESTION 88
- (Exam Topic 2)
A company hosts a website analytics application on a single Amazon EC2 On-Demand Instance. The analytics software is written in PHP and uses a MySQL
database. The analytics software, the web server that provides PHP, and the database server are all hosted on the EC2 instance. The application is showing signs
of performance degradation during busy times and is presenting 5xx errors. The company needs to make the application scale seamlessly.
Which solution will meet these requirements MOST cost-effectively?


A. Migrate the database to an Amazon RDS for MySQL DB instance. Create an AMI of the web application. Use the AMI to launch a second EC2 On-Demand Instance. Use an Application Load Balancer to distribute the load to each EC2 instance.
B. Migrate the database to an Amazon RDS for MySQL DB instance. Create an AMI of the web application. Use the AMI to launch a second EC2 On-Demand Instance. Use Amazon Route 53 weighted routing to distribute the load across the two EC2 instances.
C. Migrate the database to an Amazon Aurora MySQL DB instance. Create an AWS Lambda function to stop the EC2 instance and change the instance type. Create an Amazon CloudWatch alarm to invoke the Lambda function when CPU utilization surpasses 75%.
D. Migrate the database to an Amazon Aurora MySQL DB instance. Create an AMI of the web application. Apply the AMI to a launch template. Create an Auto Scaling group with the launch template. Configure the launch template to use a Spot Fleet. Attach an Application Load Balancer to the Auto Scaling group.

Answer: D

NEW QUESTION 90
- (Exam Topic 2)
An application runs on Amazon EC2 instances across multiple Availability Zones The instances run in an Amazon EC2 Auto Scaling group behind an Application
Load Balancer The application performs best when the CPU utilization of the EC2 instances is at or near 40%.
What should a solutions architect do to maintain the desired performance across all instances in the group?

A. Use a simple scaling policy to dynamically scale the Auto Scaling group
B. Use a target tracking policy to dynamically scale the Auto Scaling group
C. Use an AWS Lambda function to update the desired Auto Scaling group capacity.
D. Use scheduled scaling actions to scale up and scale down the Auto Scaling group

Answer: B

Explanation:
https://docs.aws.amazon.com/autoscaling/application/userguide/application-auto-scaling-target-tracking.html
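The intent of a target tracking policy can be sketched as a proportional-capacity calculation: capacity scales roughly with the ratio of the observed metric to the target. This illustrates the idea, not the exact internal AWS algorithm.

```python
import math

def target_tracking_capacity(current_capacity, current_cpu, target_cpu=40.0):
    """Estimate the capacity needed to bring average CPU back to the target."""
    # Scale capacity in proportion to how far the metric is from the target.
    return max(1, math.ceil(current_capacity * current_cpu / target_cpu))

print(target_tracking_capacity(4, 80.0))  # load doubled -> 8 instances
print(target_tracking_capacity(8, 20.0))  # load halved  -> 4 instances
```

Because the adjustment is proportional, the group converges on the 40% target without the operator writing step thresholds, which is why target tracking fits this requirement better than simple or scheduled scaling.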

NEW QUESTION 95
- (Exam Topic 2)
A medical records company is hosting an application on Amazon EC2 instances. The application processes customer data files that are stored on Amazon S3. The
EC2 instances are hosted in public subnets. The EC2 instances access Amazon S3 over the internet, but they do not require any other network access.
A new requirement mandates that the network traffic for file transfers take a private route and not be sent over the internet.
Which change to the network architecture should a solutions architect recommend to meet this requirement?

A. Create a NAT gateway. Configure the route table for the public subnets to send traffic to Amazon S3 through the NAT gateway.
B. Configure the security group for the EC2 instances to restrict outbound traffic so that only traffic to the S3 prefix list is permitted.
C. Move the EC2 instances to private subnets. Create a VPC endpoint for Amazon S3, and link the endpoint to the route table for the private subnets.
D. Remove the internet gateway from the VPC. Set up an AWS Direct Connect connection, and route traffic to Amazon S3 over the Direct Connect connection.

Answer: C

NEW QUESTION 98
- (Exam Topic 2)
A company's website provides users with downloadable historical performance reports. The website needs a solution that will scale to meet the company's website demands globally. The solution should be cost-effective, limit the provisioning of infrastructure resources, and provide the fastest possible response time.
Which combination should a solutions architect recommend to meet these requirements?

A. Amazon CloudFront and Amazon S3


B. AWS Lambda and Amazon DynamoDB
C. Application Load Balancer with Amazon EC2 Auto Scaling
D. Amazon Route 53 with internal Application Load Balancers

Answer: A

Explanation:
CloudFront serves the reports from edge locations for the fastest global response times, and S3 static hosting minimizes the infrastructure that must be provisioned.

NEW QUESTION 102


- (Exam Topic 2)
A company uses AWS Organizations to create dedicated AWS accounts for each business unit to manage each business unit's account independently upon
request. The root email recipient missed a notification that was sent to the root user email address of one account. The company wants to ensure that all future
notifications are not missed. Future notifications must be limited to account administrators.
Which solution will meet these requirements?

A. Configure the company's email server to forward notification email messages that are sent to the AWS account root user email address to all users in the organization.
B. Configure all AWS account root user email addresses as distribution lists that go to a few administrators who can respond to alerts. Configure AWS account alternate contacts in the AWS Organizations console or programmatically.
C. Configure all AWS account root user email messages to be sent to one administrator who is responsible for monitoring alerts and forwarding those alerts to the appropriate groups.
D. Configure all existing AWS accounts and all newly created accounts to use the same root user email address. Configure AWS account alternate contacts in the AWS Organizations console or programmatically.

Answer: D

NEW QUESTION 103


- (Exam Topic 2)
A company has implemented a self-managed DNS solution on three Amazon EC2 instances behind a Network Load Balancer (NLB) in the us-west-2 Region. Most
of the company's users are located in the United States and Europe. The company wants to improve the performance and availability of the solution. The company
launches and configures three EC2 instances in the eu-west-1 Region and adds the EC2 instances as targets for a new NLB.
Which solution can the company use to route traffic to all the EC2 instances?

A. Create an Amazon Route 53 geolocation routing policy to route requests to one of the two NLBs. Create an Amazon CloudFront distribution. Use the Route 53 record as the distribution's origin.
B. Create a standard accelerator in AWS Global Accelerator. Create endpoint groups in us-west-2 and eu-west-1. Add the two NLBs as endpoints for the endpoint groups.
C. Attach Elastic IP addresses to the six EC2 instances. Create an Amazon Route 53 geolocation routing policy to route requests to one of the six EC2 instances. Create an Amazon CloudFront distribution. Use the Route 53 record as the distribution's origin.
D. Replace the two NLBs with two Application Load Balancers (ALBs). Create an Amazon Route 53 latency routing policy to route requests to one of the two ALBs. Create an Amazon CloudFront distribution. Use the Route 53 record as the distribution's origin.

Answer: B

Explanation:
For standard accelerators, Global Accelerator uses the AWS global network to route traffic to the optimal regional endpoint based on health, client location, and
policies that you configure, which increases the availability of your applications. Endpoints for standard accelerators can be Network Load Balancers, Application
Load Balancers, Amazon EC2 instances, or Elastic IP addresses that are located in one AWS Region or multiple Regions.
https://docs.aws.amazon.com/global-accelerator/latest/dg/what-is-global-accelerator.html
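A toy model of the routing behavior described above: send each client to the healthy endpoint group with the lowest latency, falling back when a group becomes unhealthy. The latency numbers are invented for illustration.

```python
# Endpoint groups as in option B: one per Region, each with a health flag.
ENDPOINT_GROUPS = {
    "us-west-2": {"healthy": True},
    "eu-west-1": {"healthy": True},
}
# Assumed client-to-Region latencies in milliseconds (illustrative only).
LATENCY_MS = {
    ("US", "us-west-2"): 30, ("US", "eu-west-1"): 120,
    ("EU", "us-west-2"): 140, ("EU", "eu-west-1"): 25,
}

def route(client_region):
    """Pick the healthy endpoint group with the lowest latency for this client."""
    candidates = [r for r, g in ENDPOINT_GROUPS.items() if g["healthy"]]
    return min(candidates, key=lambda r: LATENCY_MS[(client_region, r)])

print(route("US"))  # us-west-2
print(route("EU"))  # eu-west-1
ENDPOINT_GROUPS["eu-west-1"]["healthy"] = False
print(route("EU"))  # us-west-2 -- automatic failover when a group is unhealthy
```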

NEW QUESTION 104


- (Exam Topic 2)
A gaming company has a web application that displays scores. The application runs on Amazon EC2 instances behind an Application Load Balancer. The
application stores data in an Amazon RDS for MySQL database. Users are starting to experience long delays and interruptions that are caused by database read
performance. The company wants to improve the user experience while minimizing changes to the application's architecture.
What should a solutions architect do to meet these requirements?

A. Use Amazon ElastiCache in front of the database.


B. Use RDS Proxy between the application and the database.
C. Migrate the application from EC2 instances to AWS Lambda.
D. Migrate the database from Amazon RDS for MySQL to Amazon DynamoDB.

Answer: A

Explanation:
ElastiCache can help speed up the read performance of the database by caching frequently accessed data, reducing latency and allowing the application to
access the data more quickly. This solution requires minimal modifications to the current architecture, as ElastiCache can be used in conjunction with the existing
Amazon RDS for MySQL database.
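The cache-aside pattern that answer A implies can be sketched as follows, with a dict standing in for ElastiCache and a function standing in for the expensive RDS read:

```python
cache = {}     # stand-in for ElastiCache
db_reads = 0   # counts how often the database is actually hit

def query_database(player_id):
    """Stand-in for the slow MySQL read we want to avoid repeating."""
    global db_reads
    db_reads += 1
    return {"player": player_id, "score": 1000 + player_id}

def get_score(player_id):
    if player_id in cache:              # cache hit: no database load
        return cache[player_id]
    result = query_database(player_id)  # cache miss: read through
    cache[player_id] = result           # populate the cache for next time
    return result

get_score(7); get_score(7); get_score(7)
print(db_reads)  # 1 -- two of the three reads were served from the cache
```

Only the read path changes, which is why this option minimizes changes to the application's architecture.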

NEW QUESTION 108


- (Exam Topic 2)
A company runs a production application on a fleet of Amazon EC2 instances. The application reads the data from an Amazon SQS queue and processes the
messages in parallel. The message volume is unpredictable and often has intermittent traffic. This application should continually process messages without any
downtime.
Which solution meets these requirements MOST cost-effectively?

A. Use Spot Instances exclusively to handle the maximum capacity required.


B. Use Reserved Instances exclusively to handle the maximum capacity required.
C. Use Reserved Instances for the baseline capacity and use Spot Instances to handle additional capacity.
D. Use Reserved Instances for the baseline capacity and use On-Demand Instances to handle additional capacity.

Answer: D

Explanation:
We recommend that you use On-Demand Instances for applications with short-term, irregular workloads that cannot be interrupted.
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-on-demand-instances.html
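As a rough illustration of why answer D's mix beats all On-Demand, here is a cost sketch using assumed hourly prices, not AWS list prices:

```python
# Assumed prices for illustration only.
ON_DEMAND = 0.10   # $/instance-hour
RESERVED = 0.06    # $/instance-hour (effective rate)
HOURS = 730        # hours in a month

baseline = 4            # instances the queue consumers need at all times
peak_extra_hours = 100  # instance-hours of unpredictable bursts per month

# Reserved covers the always-on baseline; On-Demand absorbs the bursts
# without the interruption risk that Spot would add.
mixed = baseline * RESERVED * HOURS + peak_extra_hours * ON_DEMAND
all_on_demand = baseline * ON_DEMAND * HOURS + peak_extra_hours * ON_DEMAND
print(round(mixed, 2), round(all_on_demand, 2))
```

The baseline dominates the bill, so discounting it with Reserved Instances saves the most, while keeping bursts on On-Demand preserves the no-downtime requirement.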

NEW QUESTION 109


- (Exam Topic 2)
A company runs an Oracle database on premises. As part of the company's migration to AWS, the company wants to upgrade the database to the most recent
available version. The company also wants to set up disaster recovery (DR) for the database. The company needs to minimize the operational overhead for normal
operations and DR setup. The company also needs to maintain access to the database's underlying operating system.
Which solution will meet these requirements?


A. Migrate the Oracle database to an Amazon EC2 instance. Set up database replication to a different AWS Region.
B. Migrate the Oracle database to Amazon RDS for Oracle. Activate Cross-Region automated backups to replicate the snapshots to another AWS Region.
C. Migrate the Oracle database to Amazon RDS Custom for Oracle. Create a read replica for the database in another AWS Region.
D. Migrate the Oracle database to Amazon RDS for Oracle. Create a standby database in another Availability Zone.

Answer: C

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/rds-custom.html and https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/working-with-
custom-oracle.html

NEW QUESTION 114


- (Exam Topic 2)
A company wants to measure the effectiveness of its recent marketing campaigns. The company performs batch processing on .csv files of sales data and stores the results in an Amazon S3 bucket once every hour. The S3 bucket contains petabytes of objects. The company runs one-time queries in Amazon Athena to determine which products are most popular on a particular date for a particular region. Queries sometimes fail or take longer than expected to finish.
Which actions should a solutions architect take to improve the query performance and reliability? (Select TWO.)

A. Reduce the S3 object sizes to less than 126 MB.
B. Partition the data by date and region in Amazon S3.
C. Store the files as large, single objects in Amazon S3.
D. Use Amazon Kinesis Data Analytics to run the queries as part of the batch processing operation.
E. Use an AWS Glue extract, transform, and load (ETL) process to convert the .csv files into Apache Parquet format.

Answer: CE
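For reference, Hive-style partitioning of the S3 keys (as option B describes) is what lets Athena prune partitions instead of scanning the whole bucket. The bucket and file names here are illustrative.

```python
def partitioned_key(date, region, filename):
    """Lay out S3 keys with Hive-style date/region partitions for Athena."""
    return f"s3://sales-results/date={date}/region={region}/{filename}"

# A query filtered on date and region now reads only this prefix.
key = partitioned_key("2023-06-01", "us-east-1", "batch-0001.parquet")
print(key)
```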

NEW QUESTION 119


- (Exam Topic 2)
A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.

Answer: C

Explanation:
An EventBridge rule can match the CreateImage API call (recorded through CloudTrail) and target an SNS topic directly, so alerts require no Lambda code, Athena tables, or queues to maintain.
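The EventBridge rule in option C matches CloudTrail-recorded API calls with an event pattern. The pattern below follows the documented shape for such rules; the sample event is abbreviated and hypothetical, and the matcher is a simplified stand-in for EventBridge's own matching:

```python
# EventBridge rule pattern for the EC2 CreateImage call recorded via CloudTrail.
event_pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {"eventSource": ["ec2.amazonaws.com"], "eventName": ["CreateImage"]},
}

def matches(pattern, event):
    """Naive subset matcher: every pattern field must list the event's value."""
    for key, allowed in pattern.items():
        value = event.get(key)
        if isinstance(allowed, dict):
            if not isinstance(value, dict) or not matches(allowed, value):
                return False
        elif value not in allowed:
            return False
    return True

sample = {
    "source": "aws.ec2",
    "detail-type": "AWS API Call via CloudTrail",
    "detail": {"eventSource": "ec2.amazonaws.com", "eventName": "CreateImage"},
}
print(matches(event_pattern, sample))
```

A matching event is forwarded to the rule's SNS target; any other API call falls through.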

NEW QUESTION 123


- (Exam Topic 2)
A company has a multi-tier application that runs six front-end web servers in an Amazon EC2 Auto Scaling group in a single Availability Zone behind an Application
Load Balancer (ALB). A solutions architect needs to modify the infrastructure to be highly available without modifying the application.
Which architecture should the solutions architect choose that provides high availability?

A. Create an Auto Scaling group that uses three instances across each of two Regions.
B. Modify the Auto Scaling group to use three instances across each of two Availability Zones.
C. Create an Auto Scaling template that can be used to quickly create more instances in another Region.
D. Change the ALB in front of the Amazon EC2 instances in a round-robin configuration to balance traffic to the web tier.

Answer: B

Explanation:
High availability can be enabled for this architecture quite simply by modifying the existing Auto Scaling group to use multiple availability zones. The ASG will
automatically balance the load so you don't actually need to specify the instances per AZ.
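The change described above amounts to widening the group's subnet list to cover two Availability Zones, expressed here as the keyword arguments one might pass to boto3's `update_auto_scaling_group`. The group name and subnet IDs are hypothetical:

```python
# Spread the existing six-instance group across two AZ-specific subnets;
# the ASG rebalances instances between the zones automatically.
asg_update = {
    "AutoScalingGroupName": "web-asg",
    "MinSize": 6,
    "MaxSize": 6,
    "DesiredCapacity": 6,
    # Comma-separated subnet IDs, one per Availability Zone.
    "VPCZoneIdentifier": "subnet-az1-example,subnet-az2-example",
}
azs = asg_update["VPCZoneIdentifier"].split(",")
print(len(azs))
```

With two subnets listed, the group ends up with three instances per zone, matching option B.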

NEW QUESTION 128


- (Exam Topic 2)
A company runs a stateless web application in production on a group of Amazon EC2 On-Demand Instances behind an Application Load Balancer. The application
experiences heavy usage during an 8-hour period each business day. Application usage is moderate and steady overnight Application usage is low during
weekends.
The company wants to minimize its EC2 costs without affecting the availability of the application. Which solution will meet these requirements?

A. Use Spot Instances for the entire workload.


B. Use Reserved instances for the baseline level of usage Use Spot Instances for any additional capacity that the application needs.
C. Use On-Demand Instances for the baseline level of usag
D. Use Spot Instances for any additional capacity that the application needs
E. Use Dedicated Instances for the baseline level of usag
F. Use On-Demand Instances for any additional capacity that the application needs

Answer: B

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (551 New Questions)

NEW QUESTION 130


- (Exam Topic 3)
An ecommerce company needs to run a scheduled daily job to aggregate and filter sales records for analytics. The company stores the sales records in an
Amazon S3 bucket. Each object can be up to 10 GB in size. Based on the number of sales events, the job can take up to an hour to complete. The CPU and
memory usage of the job are constant and are known in advance.
A solutions architect needs to minimize the amount of operational effort that is needed for the job to run. Which solution meets these requirements?

A. Create an AWS Lambda function that has an Amazon EventBridge notification. Schedule the EventBridge event to run once a day.
B. Create an AWS Lambda function. Create an Amazon API Gateway HTTP API, and integrate the API with the function. Create an Amazon EventBridge scheduled event that calls the API and invokes the function.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster with an AWS Fargate launch type. Create an Amazon EventBridge scheduled event that launches an ECS task on the cluster to run the job.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster with an Amazon EC2 launch type and an Auto Scaling group with at least one EC2 instance. Create an Amazon EventBridge scheduled event that launches an ECS task on the cluster to run the job.

Answer: C

NEW QUESTION 133


- (Exam Topic 3)
A company uses a payment processing system that requires messages for a particular payment ID to be received in the same order that they were sent. Otherwise,
the payments might be processed incorrectly.
Which actions should a solutions architect take to meet this requirement? (Select TWO.)

A. Write the messages to an Amazon DynamoDB table with the payment ID as the partition key.
B. Write the messages to an Amazon Kinesis data stream with the payment ID as the partition key.
C. Write the messages to an Amazon ElastiCache for Memcached cluster with the payment ID as the key.
D. Write the messages to an Amazon Simple Queue Service (Amazon SQS) queue. Set the message attribute to use the payment ID.
E. Write the messages to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Set the message group to use the payment ID.

Answer: BE

Explanation:
A Kinesis data stream preserves ordering within a shard when the payment ID is the partition key, and an SQS FIFO queue preserves ordering within a message group when the payment ID is the message group ID.
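A minimal sketch of the FIFO option: the payment ID becomes the `MessageGroupId`, which is the ordering scope for `sqs.send_message`. The queue URL and payload fields are hypothetical:

```python
import json

def fifo_send_kwargs(queue_url, payment_id, payload):
    """Arguments for sqs.send_message against a FIFO queue: the payment ID
    becomes the MessageGroupId, so messages for one payment stay ordered
    while different payments are processed in parallel."""
    return {
        "QueueUrl": queue_url,
        "MessageBody": json.dumps(payload),
        "MessageGroupId": payment_id,
        # Or enable content-based deduplication on the queue instead.
        "MessageDeduplicationId": f"{payment_id}-{payload['seq']}",
    }

kwargs = fifo_send_kwargs(
    "https://sqs.us-east-1.amazonaws.com/123456789012/payments.fifo",
    "pay-42",
    {"seq": 1, "amount": 100},
)
print(kwargs["MessageGroupId"])
```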

NEW QUESTION 138


- (Exam Topic 3)
A company wants to create an application to store employee data in a hierarchical structured relationship. The company needs a minimum-latency response to
high-traffic queries for the employee data and must protect any sensitive data. The company also needs to receive monthly email messages if any financial
information is present in the employee data.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Use Amazon Redshift to store the employee data in hierarchies. Unload the data to Amazon S3 every month.
B. Use Amazon DynamoDB to store the employee data in hierarchies. Export the data to Amazon S3 every month.
C. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly events to AWS Lambda.
D. Use Amazon Athena to analyze the employee data in Amazon S3. Integrate Athena with Amazon QuickSight to publish analysis dashboards and share the dashboards with users.
E. Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly notifications through an Amazon Simple Notification Service (Amazon SNS) subscription.

Answer: BE

NEW QUESTION 142


- (Exam Topic 3)
An ecommerce company is building a distributed application that involves several serverless functions and AWS services to complete order-processing tasks.
These tasks require manual approvals as part of the workflow. A solutions architect needs to design an architecture for the order-processing application. The
solution must be able to combine multiple AWS Lambda functions into responsive serverless applications. The solution also must orchestrate data and services
that run on Amazon EC2 instances, containers, or on-premises servers.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Step Functions to build the application.


B. Integrate all the application components in an AWS Glue job
C. Use Amazon Simple Queue Service (Amazon SQS) to build the application
D. Use AWS Lambda functions and Amazon EventBridge (Amazon CloudWatch Events) events to build the application

Answer: A

Explanation:
AWS Step Functions orchestrates multiple Lambda functions, supports manual approval steps, and can coordinate services that run on Amazon EC2, in containers, or on premises, with no orchestration code to maintain.

NEW QUESTION 143


- (Exam Topic 3)
A company deploys an application on five Amazon EC2 instances. An Application Load Balancer (ALB) distributes traffic to the instances by using a target group.
The average CPU usage on each of the instances is below 10% most of the time, with occasional surges to 65%.
A solutions architect needs to implement a solution to automate the scalability of the application. The solution must optimize the cost of the architecture and must
ensure that the application has enough CPU resources when surges occur.
Which solution will meet these requirements?

A. Create an Amazon CloudWatch alarm that enters the ALARM state when the CPUUtilization metric is less than 20%. Create an AWS Lambda function that the CloudWatch alarm invokes to terminate one of the EC2 instances in the ALB target group.
B. Create an EC2 Auto Scaling group. Select the existing ALB as the load balancer and the existing target group as the target group. Set a target tracking scaling policy that is based on the ASGAverageCPUUtilization metric. Set the minimum instances to 2, the desired capacity to 3, the maximum instances to 6, and the target value to 50%. Add the EC2 instances to the Auto Scaling group.
C. Create an EC2 Auto Scaling group. Select the existing ALB as the load balancer and the existing target group. Set the minimum instances to 2, the desired capacity to 3, and the maximum instances to 6. Add the EC2 instances to the Auto Scaling group.
D. Create two Amazon CloudWatch alarms. Configure the first CloudWatch alarm to enter the ALARM state when the average CPUUtilization metric is below 20%. Configure the second CloudWatch alarm to enter the ALARM state when the average CPUUtilization metric is above 50%. Configure the alarms to publish to an Amazon Simple Notification Service (Amazon SNS) topic to send an email message. After receiving the message, log in to decrease or increase the number of EC2 instances that are running.

Answer: B

NEW QUESTION 145


- (Exam Topic 3)
A company has an API that receives real-time data from a fleet of monitoring devices. The API stores this data in an Amazon RDS DB instance for later analysis.
The amount of data that the monitoring devices send to the API fluctuates. During periods of heavy traffic, the API often returns timeout errors.
After an inspection of the logs, the company determines that the database is not capable of processing the volume of write traffic that comes from the API. A
solutions architect must minimize the number of
connections to the database and must ensure that data is not lost during periods of heavy traffic. Which solution will meet these requirements?

A. Increase the size of the DB instance to an instance type that has more available memory.
B. Modify the DB instance to be a Multi-AZ DB instance. Configure the application to write to all active RDS DB instances.
C. Modify the API to write incoming data to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function that Amazon SQS invokes to write data from the queue to the database.
D. Modify the API to write incoming data to an Amazon Simple Notification Service (Amazon SNS) topic. Use an AWS Lambda function that Amazon SNS invokes to write data from the topic to the database.

Answer: C

Explanation:
Using Amazon SQS will help minimize the number of connections to the database, as the API will write data to a queue instead of directly to the database.
Additionally, using an AWS Lambda function that Amazon SQS invokes to write data from the queue to the database will help ensure that data is not lost during
periods of heavy traffic, as the queue will serve as a buffer between the API and the database.
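A minimal sketch of the Lambda side of this pattern, assuming a standard SQS event source mapping. The database writer is injected so the example runs without AWS, and the partial-batch failure response is the documented way to have SQS retry only failed messages:

```python
import json

def handler(event, context=None, write_record=None):
    """SQS-triggered Lambda sketch: drain a batch of queued messages and
    write each to the database via write_record (injected here so the
    example runs without AWS)."""
    failures = []
    for record in event["Records"]:
        try:
            write_record(json.loads(record["body"]))
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    # Partial-batch response format understood by the SQS event source mapping.
    return {"batchItemFailures": failures}

stored = []
event = {"Records": [
    {"messageId": "m1", "body": json.dumps({"device": "d1", "reading": 7})},
    {"messageId": "m2", "body": "not json"},
]}
result = handler(event, write_record=stored.append)
print(stored, result)
```

The queue absorbs traffic spikes, and the Lambda batch size caps concurrent database connections.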

NEW QUESTION 150


- (Exam Topic 3)
A company wants to migrate an Oracle database to AWS. The database consists of a single table that contains millions of geographic information systems (GIS)
images that are high resolution and are identified by a geographic code.
When a natural disaster occurs, tens of thousands of images get updated every few minutes. Each geographic code has a single image or row that is associated
with it. The company wants a solution that is highly available and scalable during such events.
Which solution meets these requirements MOST cost-effectively?

A. Store the images and geographic codes in a database table Use Oracle running on an Amazon RDS Multi-AZ DB instance
B. Store the images in Amazon S3 buckets Use Amazon DynamoDB with the geographic code as the key and the image S3 URL as the value
C. Store the images and geographic codes in an Amazon DynamoDB table Configure DynamoDB Accelerator (DAX) during times of high load
D. Store the images in Amazon S3 buckets Store geographic codes and image S3 URLs in a database table Use Oracle running on an Amazon RDS Multi-AZ DB
instance.

Answer: B

Explanation:
Storing the large images in Amazon S3 and mapping each geographic code to its image's S3 URL in DynamoDB is highly available and scales to bursts of updates at a far lower cost than running Oracle on an RDS Multi-AZ instance.
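The S3-plus-DynamoDB option boils down to one small item per geographic code. A sketch of that item in DynamoDB's attribute-value format, with hypothetical table contents:

```python
def image_item(geo_code, bucket, key):
    """DynamoDB item mapping a geographic code (the partition key) to the
    S3 location of its image. Bucket and key names are hypothetical."""
    return {
        "geo_code": {"S": geo_code},   # partition key: one image per code
        "s3_url": {"S": f"s3://{bucket}/{key}"},
    }

item = image_item("GC-1017", "gis-images-example", "GC-1017.tif")
print(item["s3_url"]["S"])
```

Updates during a disaster overwrite the S3 object and (if needed) the tiny item, so write volume scales independently of image size.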

NEW QUESTION 152


- (Exam Topic 3)
A company has a web application that is based on Java and PHP. The company plans to move the application from on premises to AWS. The company needs the
ability to test new site features frequently. The company also needs a highly available and managed solution that requires minimum operational overhead.
Which solution will meet these requirements?

A. Create an Amazon S3 bucket. Enable static web hosting on the S3 bucket. Upload the static content to the S3 bucket. Use AWS Lambda to process all dynamic content.
B. Deploy the web application to an AWS Elastic Beanstalk environment. Use URL swapping to switch between multiple Elastic Beanstalk environments for feature testing.
C. Deploy the web application to Amazon EC2 instances that are configured with Java and PHP. Use Auto Scaling groups and an Application Load Balancer to manage the website's availability.
D. Containerize the web application. Deploy the web application to Amazon EC2 instances. Use the AWS Load Balancer Controller to dynamically route traffic between containers that contain the new site features for testing.

Answer: B

NEW QUESTION 153


- (Exam Topic 3)
A company runs a containerized application on a Kubernetes cluster in an on-premises data center. The company is using a MongoDB database for data storage.
The company wants to migrate some of these environments to AWS, but no code changes or deployment method changes are possible at this time. The company
needs a solution that minimizes operational overhead.
Which solution meets these requirements?

A. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes for compute and MongoDB on EC2 for data storage.
B. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate for compute and Amazon DynamoDB for data storage.
C. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes for compute and Amazon DynamoDB for data storage.


D. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate for compute and Amazon DocumentDB (with MongoDB compatibility) for data
storage.

Answer: D

Explanation:
Amazon DocumentDB (with MongoDB compatibility) is a fast, reliable, and fully managed database service. Amazon DocumentDB makes it easy to set up,
operate, and scale MongoDB-compatible databases in the cloud. With Amazon DocumentDB, you can run the same application code and use the same drivers
and tools that you use with MongoDB.
https://docs.aws.amazon.com/documentdb/latest/developerguide/what-is.html

NEW QUESTION 156


- (Exam Topic 3)
A company is using Amazon CloudFront with its website. The company has enabled logging on the CloudFront distribution, and logs are saved in one of the
company's Amazon S3 buckets. The company needs to perform advanced analyses on the logs and build visualizations.
What should a solutions architect do to meet these requirements?

A. Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.
B. Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.
C. Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.
D. Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

Answer: B

Explanation:
Amazon Athena runs standard SQL directly against the CloudFront logs in Amazon S3, and Amazon QuickSight visualizes Athena results. DynamoDB cannot run SQL queries against S3 objects, and AWS Glue is not a visualization tool.

NEW QUESTION 158


- (Exam Topic 3)
A financial company hosts a web application on AWS. The application uses an Amazon API Gateway Regional API endpoint to give users the ability to retrieve
current stock prices. The company's security team has noticed an increase in the number of API requests. The security team is concerned that HTTP flood attacks
might take the application offline.
A solutions architect must design a solution to protect the application from this type of attack. Which solution meets these requirements with the LEAST operational
overhead?

A. Create an Amazon CloudFront distribution in front of the API Gateway Regional API endpoint with a maximum TTL of 24 hours.
B. Create a Regional AWS WAF web ACL with a rate-based rule. Associate the web ACL with the API Gateway stage.
C. Use Amazon CloudWatch metrics to monitor the Count metric and alert the security team when the predefined rate is reached.
D. Create an Amazon CloudFront distribution with Lambda@Edge in front of the API Gateway Regional API endpoint. Create an AWS Lambda function to block requests from IP addresses that exceed the predefined rate.

Answer: B
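A sketch of the rate-based rule from the WAF option, in the JSON shape the WAFv2 API uses for rules inside a web ACL. The request limit shown is an illustrative value, not one given in the question:

```python
# Rate-based rule: block any source IP that exceeds the limit within a
# rolling five-minute window. The 2,000-request limit is a made-up example.
rate_rule = {
    "Name": "limit-request-floods",
    "Priority": 0,
    "Statement": {
        "RateBasedStatement": {
            "Limit": 2000,              # requests per 5-minute window per IP
            "AggregateKeyType": "IP",
        }
    },
    "Action": {"Block": {}},
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "limit-request-floods",
    },
}
print(rate_rule["Statement"]["RateBasedStatement"]["Limit"])
```

Associating the containing web ACL with the API Gateway stage makes WAF evaluate every request, with no custom code to operate.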

NEW QUESTION 162


- (Exam Topic 3)
A company needs to provide its employees with secure access to confidential and sensitive files. The company wants to ensure that the files can be accessed only
by authorized users. The files must be downloaded securely to the employees' devices.
The files are stored in an on-premises Windows file server. However, due to an increase in remote usage, the file server is out of capacity.
Which solution will meet these requirements?

A. Migrate the file server to an Amazon EC2 instance in a public subnet. Configure the security group to limit inbound traffic to the employees' IP addresses.
B. Migrate the files to an Amazon FSx for Windows File Server file system. Integrate the Amazon FSx file system with the on-premises Active Directory. Configure AWS Client VPN.
C. Migrate the files to Amazon S3, and create a private VPC endpoint. Create a signed URL to allow download.
D. Migrate the files to Amazon S3, and create a public VPC endpoint. Allow employees to sign on with AWS IAM Identity Center (AWS Single Sign-On).

Answer: B

NEW QUESTION 167


- (Exam Topic 3)
A company wants to configure its Amazon CloudFront distribution to use SSL/TLS certificates. The company does not want to use the default domain name for the
distribution. Instead, the company wants to use a different domain name for the distribution.
Which solution will deploy the certificate without incurring any additional costs?

A. Request an Amazon issued private certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
B. Request an Amazon issued private certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.
C. Request an Amazon issued public certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
D. Request an Amazon issued public certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.

Answer: C

Explanation:
CloudFront requires its ACM certificate to be in the us-east-1 Region, and Amazon issued public certificates from ACM are free. Private certificates require a paid private certificate authority.

NEW QUESTION 170


- (Exam Topic 3)
A company is designing a shared storage solution for a gaming application that is hosted in the AWS Cloud. The company needs the ability to use SMB clients to
access the data. The solution must be fully managed.
Which AWS solution meets these requirements?

A. Create an AWS DataSync task that shares the data as a mountable file system. Mount the file system to the application server.


B. Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the application server to the file share.
C. Create an Amazon FSx for Windows File Server file system. Attach the file system to the origin server. Connect the application server to the file system.
D. Create an Amazon S3 bucket. Assign an IAM role to the application to grant access to the S3 bucket. Mount the S3 bucket to the application server.

Answer: C

NEW QUESTION 175


- (Exam Topic 3)
A media company hosts its website on AWS. The website application's architecture includes a fleet of Amazon EC2 instances behind an Application Load Balancer
(ALB) and a database that is hosted on Amazon Aurora. The company's cybersecurity team reports that the application is vulnerable to SQL injection.
How should the company resolve this issue?

A. Use AWS WAF in front of the ALB. Associate the appropriate web ACLs with AWS WAF.
B. Create an ALB listener rule to reply to SQL injection with a fixed response.
C. Subscribe to AWS Shield Advanced to block all SQL injection attempts automatically.
D. Set up Amazon Inspector to block all SQL injection attempts automatically.

Answer: A

NEW QUESTION 176


- (Exam Topic 3)
A solutions architect needs to design a highly available application consisting of web, application, and database tiers. HTTPS content delivery should be as close
to the edge as possible, with the least delivery time.
Which solution meets these requirements and is MOST secure?

A. Configure a public Application Load Balancer (ALB) with multiple redundant Amazon EC2 instances in public subnets. Configure Amazon CloudFront to deliver HTTPS content using the public ALB as the origin.
B. Configure a public Application Load Balancer with multiple redundant Amazon EC2 instances in private subnets. Configure Amazon CloudFront to deliver HTTPS content using the EC2 instances as the origin.
C. Configure a public Application Load Balancer (ALB) with multiple redundant Amazon EC2 instances in private subnets. Configure Amazon CloudFront to deliver HTTPS content using the public ALB as the origin.
D. Configure a public Application Load Balancer with multiple redundant Amazon EC2 instances in public subnets. Configure Amazon CloudFront to deliver HTTPS content using the EC2 instances as the origin.

Answer: C

Explanation:
This solution meets the requirements for a highly available application with web, application, and database tiers, as well as providing edge-based content delivery.
It also maximizes security: the EC2 instances sit in private subnets, which prevents direct access to the web servers from the internet, while traffic is still served
through the public ALB and CloudFront. Keeping the web servers off the public internet reduces the attack surface and provides a secure way to access the
application.

NEW QUESTION 180


- (Exam Topic 3)
A company is building an application that consists of several microservices. The company has decided to use container technologies to deploy its software on
AWS. The company needs a solution that minimizes the amount of ongoing effort for maintenance and scaling. The company cannot manage additional
infrastructure.
Which combination of actions should a solutions architect take to meet these requirements? (Choose two.)

A. Deploy an Amazon Elastic Container Service (Amazon ECS) cluster.
B. Deploy the Kubernetes control plane on Amazon EC2 instances that span multiple Availability Zones.
C. Deploy an Amazon Elastic Container Service (Amazon ECS) service with an Amazon EC2 launch type. Specify a desired task number level of greater than or equal to 2.
D. Deploy an Amazon Elastic Container Service (Amazon ECS) service with a Fargate launch type. Specify a desired task number level of greater than or equal to 2.
E. Deploy Kubernetes worker nodes on Amazon EC2 instances that span multiple Availability Zones. Create a deployment that specifies two or more replicas for each microservice.

Answer: AD

Explanation:
AWS Fargate is a technology that you can use with Amazon ECS to run containers without having to manage servers or clusters of Amazon EC2 instances. With
Fargate, you no longer have to provision, configure, or scale clusters of virtual machines to run containers.
https://docs.aws.amazon.com/AmazonECS/latest/userguide/what-is-fargate.html
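As a sketch, the Fargate option maps to an ECS service definition like the following, expressed as boto3-style `create_service` arguments. The cluster, task definition, and subnet IDs are hypothetical:

```python
# ECS service on Fargate: no EC2 instances or Kubernetes control plane
# to manage, and ECS keeps the desired number of tasks running.
service = {
    "cluster": "microservices-example",
    "serviceName": "orders",
    "taskDefinition": "orders:1",
    "launchType": "FARGATE",
    "desiredCount": 2,            # at least 2 tasks for availability
    "networkConfiguration": {
        "awsvpcConfiguration": {"subnets": ["subnet-a-example", "subnet-b-example"]}
    },
}
print(service["launchType"], service["desiredCount"])
```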

NEW QUESTION 182


- (Exam Topic 3)
A company is running a multi-tier ecommerce web application in the AWS Cloud. The application runs on Amazon EC2 instances with an Amazon RDS for
MySQL Multi-AZ DB instance. Amazon RDS is configured with the latest generation DB instance with 2,000 GB of storage in a General Purpose SSD (gp3)
Amazon Elastic Block Store (Amazon EBS) volume. The database performance affects the application during periods of high demand.
A database administrator analyzes the logs in Amazon CloudWatch Logs and discovers that the application performance always degrades when the number of
read and write IOPS is higher than 20,000.
What should a solutions architect do to improve the application performance?

A. Replace the volume with a magnetic volume.
B. Increase the number of IOPS on the gp3 volume.
C. Replace the volume with a Provisioned IOPS SSD (io2) volume.
D. Replace the 2,000 GB gp3 volume with two 1,000 GB gp3 volumes.


Answer: C
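The arithmetic behind the answer: gp3 tops out below the observed load, while Provisioned IOPS SSD volumes do not. The ceilings below are the documented per-volume maximums at the time of writing; check current limits before relying on them:

```python
# Rough per-volume IOPS ceilings for Amazon EBS volume types.
MAX_IOPS = {"gp3": 16_000, "io1": 64_000, "io2_block_express": 256_000}

required_iops = 20_000  # read+write IOPS at which performance degrades

print(MAX_IOPS["gp3"] >= required_iops)               # gp3 cannot reach 20,000
print(MAX_IOPS["io2_block_express"] >= required_iops) # Provisioned IOPS SSD can
```

This is why simply "increasing the IOPS on the gp3 volume" cannot work: the target exceeds the volume type's maximum.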

NEW QUESTION 187


- (Exam Topic 3)
A company's web application consists of an Amazon API Gateway API in front of an AWS Lambda function and an Amazon DynamoDB database. The Lambda
function handles the business logic, and the DynamoDB table hosts the data. The application uses Amazon Cognito user pools to identify the individual users of the
application. A solutions architect needs to update the application so that only users who have a subscription can access premium content.
Which solution will meet this requirement?

A. Enable API caching and throttling on the API Gateway API.
B. Set up AWS WAF on the API Gateway API. Create a rule to filter users who have a subscription.
C. Apply fine-grained IAM permissions to the premium content in the DynamoDB table.
D. Implement API usage plans and API keys to limit the access of users who do not have a subscription.

Answer: C

NEW QUESTION 189


- (Exam Topic 3)
A company is moving its on-premises Oracle database to Amazon Aurora PostgreSQL. The database has several applications that write to the same tables. The
applications need to be migrated one by one with a month in between each migration. Management has expressed concerns that the database has a high number
of reads and writes. The data must be kept in sync across both databases throughout the migration.
What should a solutions architect recommend?

A. Use AWS DataSync for the initial migration. Use AWS Database Migration Service (AWS DMS) to create a change data capture (CDC) replication task and a table mapping to select all tables.
B. Use AWS DataSync for the initial migration. Use AWS Database Migration Service (AWS DMS) to create a full load plus change data capture (CDC) replication task and a table mapping to select all tables.
C. Use the AWS Schema Conversion Tool with AWS Database Migration Service (AWS DMS) using a memory optimized replication instance. Create a full load plus change data capture (CDC) replication task and a table mapping to select all tables.
D. Use the AWS Schema Conversion Tool with AWS Database Migration Service (AWS DMS) using a compute optimized replication instance. Create a full load plus change data capture (CDC) replication task and a table mapping to select the largest tables.

Answer: C
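A sketch of the DMS task named in the answer: a full-load-plus-CDC migration with a selection rule covering all tables. The identifiers are placeholders:

```python
# DMS replication task settings: "full-load-and-cdc" copies everything once,
# then keeps both databases in sync via change data capture, which is what
# lets applications be cut over one at a time.
dms_task = {
    "ReplicationTaskIdentifier": "oracle-to-aurora-example",
    "MigrationType": "full-load-and-cdc",
    "TableMappings": {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "all-tables",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    },
}
print(dms_task["MigrationType"])
```

The `%` wildcards in the object locator are what "select all tables" translates to in a DMS table mapping.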

NEW QUESTION 191


- (Exam Topic 3)
A company hosts a three-tier application on Amazon EC2 instances in a single Availability Zone. The web application uses a self-managed MySQL database that is
hosted on an EC2 instance to store data in an Amazon Elastic Block Store (Amazon EBS) volume. The MySQL database currently uses a 1 TB Provisioned IOPS
SSD (io2) EBS volume. The company expects traffic of 1,000 IOPS for both reads and writes at peak traffic.
The company wants to minimize any disruptions, stabilize performance, and reduce costs while retaining the capacity for double the IOPS. The company wants
to move the database tier to a fully managed solution that is highly available and fault tolerant.
Which solution will meet these requirements MOST cost-effectively?

A. Use a Multi-AZ deployment of an Amazon RDS for MySQL DB instance with an io2 Block Express EBS volume.
B. Use a Multi-AZ deployment of an Amazon RDS for MySQL DB instance with a General Purpose SSD (gp2) EBS volume.
C. Use Amazon S3 Intelligent-Tiering access tiers.
D. Use two large EC2 instances to host the database in active-passive mode.

Answer: B

Explanation:
A Multi-AZ RDS for MySQL deployment is fully managed, highly available, and fault tolerant. A 1 TB General Purpose SSD (gp2) volume has a baseline of 3 IOPS per GB, or 3,000 IOPS, which covers double the 1,000 IOPS peak at a lower cost than Provisioned IOPS storage.
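The gp2 sizing arithmetic, using the documented baseline of 3 IOPS per GiB (minimum 100, capped at 16,000):

```python
def gp2_baseline_iops(size_gib):
    """Baseline IOPS for a gp2 volume: 3 IOPS/GiB, floor 100, cap 16,000."""
    return min(max(3 * size_gib, 100), 16_000)

peak_iops = 1_000
headroom_target = 2 * peak_iops   # "capacity for double the IOPS"

print(gp2_baseline_iops(1_000))                     # 3000
print(gp2_baseline_iops(1_000) >= headroom_target)  # True
```

A 1,000 GiB gp2 volume already sustains 3,000 IOPS, comfortably above the 2,000 IOPS headroom target, without paying for provisioned IOPS.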

NEW QUESTION 192


- (Exam Topic 3)
A company needs a backup strategy for its three-tier stateless web application. The web application runs on Amazon EC2 instances in an Auto Scaling group with
a dynamic scaling policy that is configured to respond to scaling events. The database tier runs on Amazon RDS for PostgreSQL. The web application does not
require temporary local storage on the EC2 instances. The company's recovery point objective (RPO) is 2 hours.
The backup strategy must maximize scalability and optimize resource utilization for this environment. Which solution will meet these requirements?

A. Take snapshots of Amazon Elastic Block Store (Amazon EBS) volumes of the EC2 instances and database every 2 hours to meet the RPO
B. Configure a snapshot lifecycle policy to take Amazon Elastic Block Store (Amazon EBS) snapshots Enable automated backups in Amazon RDS to meet the
RPO
C. Retain the latest Amazon Machine Images (AMIs) of the web and application tiers Enable automated backups in Amazon RDS and use point-in-time recovery to
meet the RPO
D. Take snapshots of Amazon Elastic Block Store (Amazon EBS) volumes of the EC2 instances every 2 hours Enable automated backups in Amazon RDS and
use point-in-time recovery to meet the RPO

Answer: C

Explanation:
The web tier is stateless, so the EC2 instances hold no data worth backing up; retaining the latest AMIs is enough to rebuild them. RDS automated backups with point-in-time recovery meet the 2-hour RPO without scheduling EBS snapshots.

NEW QUESTION 197


- (Exam Topic 3)
A company hosts a web application on multiple Amazon EC2 instances. The EC2 instances are in an Auto Scaling group that scales in response to user demand.
The company wants to optimize cost savings without making a long-term commitment.
Which EC2 instance purchasing option should a solutions architect recommend to meet these requirements?

A. Dedicated Instances only


B. On-Demand Instances only
C. A mix of On-Demand instances and Spot Instances
D. A mix of On-Demand instances and Reserved instances


Answer: C

Explanation:
Dedicated Instances are the most expensive option, and Reserved Instances require a long-term commitment. On-Demand Instances for the baseline plus Spot Instances for the variable portion of the Auto Scaling group optimize cost with no commitment.

NEW QUESTION 199


- (Exam Topic 3)
A company is developing a marketing communications service that targets mobile app users. The company needs to send confirmation messages with Short
Message Service (SMS) to its users. The users must be able to reply to the SMS messages. The company must store the responses for a year for analysis.
What should a solutions architect do to meet these requirements?

A. Create an Amazon Connect contact flow to send the SMS messages. Use AWS Lambda to process the responses.
B. Build an Amazon Pinpoint journey. Configure Amazon Pinpoint to send events to an Amazon Kinesis data stream for analysis and archiving.
C. Use Amazon Simple Queue Service (Amazon SQS) to distribute the SMS messages. Use AWS Lambda to process the responses.
D. Create an Amazon Simple Notification Service (Amazon SNS) FIFO topic. Subscribe an Amazon Kinesis data stream to the SNS topic for analysis and archiving.

Answer: B

Explanation:
https://aws.amazon.com/pinpoint/product-details/sms/ Two-Way Messaging: Receive SMS messages from your customers and reply back to them in a chat-like
interactive experience. With Amazon Pinpoint, you can create automatic responses when customers send you messages that contain certain keywords. You can
even use Amazon Lex to create conversational bots. A majority of mobile phone users read incoming SMS messages almost immediately after receiving them. If
you need to be able to provide your customers with urgent or important information, SMS messaging may be the right solution for you. You can use Amazon
Pinpoint to create targeted groups of customers, and then send them campaign-based messages. You can also use Amazon Pinpoint to send direct messages,
such as appointment confirmations, order updates, and one-time passwords.
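The two-way SMS send described above can be sketched with the parameters that boto3's Pinpoint `send_messages` call expects. This is a minimal sketch: the application ID, phone number, and message body below are placeholders, not values from the question.

```python
# Sketch of a transactional SMS send through Amazon Pinpoint. The helper only
# builds the request payload; in real use it would be passed to
# boto3.client("pinpoint").send_messages(**request).
def build_sms_request(application_id, destination_number, body):
    """Build the parameters for pinpoint.send_messages."""
    return {
        "ApplicationId": application_id,
        "MessageRequest": {
            # One address per recipient; ChannelType selects SMS delivery.
            "Addresses": {destination_number: {"ChannelType": "SMS"}},
            "MessageConfiguration": {
                "SMSMessage": {
                    "Body": body,
                    # Confirmation messages are transactional, not promotional.
                    "MessageType": "TRANSACTIONAL",
                }
            },
        },
    }

# Placeholder application ID and number for illustration only.
request = build_sms_request(
    "app-1234567890",
    "+15555550100",
    "Your appointment is confirmed. Reply YES to accept.",
)
```

Replies then arrive as two-way messaging events that Pinpoint can stream to Kinesis for the one-year retention and analysis requirement.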

NEW QUESTION 202


- (Exam Topic 3)
A company previously migrated its data warehouse solution to AWS. The company also has an AWS Direct Connect connection. Corporate office users query the
data warehouse using a visualization tool. The average size of a query returned by the data warehouse is 50 MB and each webpage sent by the visualization tool
is approximately 500 KB. Result sets returned by the data warehouse are not cached.
Which solution provides the LOWEST data transfer egress cost for the company?

A. Host the visualization tool on premises and query the data warehouse directly over the internet.
B. Host the visualization tool in the same AWS Region as the data warehouse. Access it over the internet.
C. Host the visualization tool on premises and query the data warehouse directly over a Direct Connect connection at a location in the same AWS Region.
D. Host the visualization tool in the same AWS Region as the data warehouse and access it over a Direct Connect connection at a location in the same Region.

Answer: D

Explanation:
https://aws.amazon.com/directconnect/pricing/ https://aws.amazon.com/blogs/aws/aws-data-transfer-prices-reduced/
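The egress arithmetic behind this comparison can be checked directly: hosting the tool in-Region means only the ~500 KB webpage leaves AWS per query instead of the 50 MB result set. The per-GB prices below are illustrative placeholders, not current AWS rates; only the ratios matter.

```python
# Rough per-query egress comparison for the scenario above.
QUERY_MB = 50.0           # result set pulled if the tool runs on premises
PAGE_MB = 0.5             # webpage sent if the tool runs in the same Region
DX_PRICE_PER_GB = 0.02    # assumed Direct Connect data-transfer-out rate (USD)
INET_PRICE_PER_GB = 0.09  # assumed internet data-transfer-out rate (USD)

def egress_cost(mb_out, price_per_gb):
    """Cost of sending mb_out megabytes out of AWS at the given per-GB rate."""
    return (mb_out / 1024) * price_per_gb

on_prem_dx = egress_cost(QUERY_MB, DX_PRICE_PER_GB)    # tool on premises, DX
in_region_dx = egress_cost(PAGE_MB, DX_PRICE_PER_GB)   # tool in-Region, DX
in_region_inet = egress_cost(PAGE_MB, INET_PRICE_PER_GB)

# Moving the tool in-Region cuts egress per query 100x (50 MB -> 0.5 MB),
# and Direct Connect rates are lower than internet rates on what remains.
assert in_region_dx < in_region_inet < on_prem_dx
```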

NEW QUESTION 203


- (Exam Topic 3)
A company has implemented a self-managed DNS service on AWS. The solution consists of the following:
• Amazon EC2 instances in different AWS Regions
• Endpoints of a standard accelerator in AWS Global Accelerator
The company wants to protect the solution against DDoS attacks. What should a solutions architect do to meet this requirement?

A. Subscribe to AWS Shield Advanced Add the accelerator as a resource to protect


B. Subscribe to AWS Shield Advanced Add the EC2 instances as resources to protect
C. Create an AWS WAF web ACL that includes a rate-based rule Associate the web ACL with the accelerator
D. Create an AWS WAF web ACL that includes a rate-based rule Associate the web ACL with the EC2 instances

Answer: B

NEW QUESTION 206


- (Exam Topic 3)
A company is building a mobile app on AWS. The company wants to expand its reach to millions of users. The company needs to build a platform so that
authorized users can watch the company's content on their mobile devices.
What should a solutions architect recommend to meet these requirements?

A. Publish content to a public Amazon S3 bucket. Use AWS Key Management Service (AWS KMS) keys to stream content.
B. Set up IPsec VPN between the mobile app and the AWS environment to stream content.
C. Use Amazon CloudFront. Provide signed URLs to stream content.
D. Set up AWS Client VPN between the mobile app and the AWS environment to stream content.

Answer: A

NEW QUESTION 207


- (Exam Topic 3)
A medical research lab produces data that is related to a new study. The lab wants to make the data available with minimum latency to clinics across the country
for their on-premises, file-based applications. The data files are stored in an Amazon S3 bucket that has read-only permissions for each clinic.
What should a solutions architect recommend to meet these requirements?


A. Deploy an AWS Storage Gateway file gateway as a virtual machine (VM) on premises at each clinic
B. Migrate the files to each clinic’s on-premises applications by using AWS DataSync for processing.
C. Deploy an AWS Storage Gateway volume gateway as a virtual machine (VM) on premises at each clinic.
D. Attach an Amazon Elastic File System (Amazon EFS) file system to each clinic’s on-premises servers.

Answer: A

Explanation:
AWS Storage Gateway is a service that connects an on-premises software appliance with cloud-based storage to provide seamless and secure integration
between an organization's on-premises IT environment and AWS's storage infrastructure. By deploying a file gateway as a virtual machine on each clinic's
premises, the medical research lab can provide low-latency access to the data stored in the S3 bucket while maintaining read-only permissions for each clinic. This
solution allows the clinics to access the data files directly from their
on-premises file-based applications without the need for data transfer or migration.
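Once a clinic's file gateway is activated, the read-only access can be expressed as an NFS file share that maps to the study's S3 bucket. This is a sketch built as the parameters for the `storagegateway.create_nfs_file_share` call; the gateway ARN, role, bucket, and client CIDR below are placeholders.

```python
# Sketch of a read-only NFS file share on one clinic's file gateway.
# In real use: boto3.client("storagegateway").create_nfs_file_share(**nfs_share_params)
nfs_share_params = {
    "ClientToken": "clinic-042-share",  # idempotency token for the create call
    "GatewayARN": "arn:aws:storagegateway:us-east-1:111122223333:gateway/sgw-12345678",
    "Role": "arn:aws:iam::111122223333:role/gateway-s3-access",
    # The S3 bucket that holds the study data files.
    "LocationARN": "arn:aws:s3:::study-data-bucket",
    "ReadOnly": True,                   # clinics have read-only permissions
    "ClientList": ["10.0.42.0/24"],     # the clinic's on-premises subnet
}
```

The clinic's file-based applications then mount the share over NFS while the gateway caches recently read objects locally for low latency.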

NEW QUESTION 208


- (Exam Topic 3)
A company is hosting a web application from an Amazon S3 bucket. The application uses Amazon Cognito as an identity provider to authenticate users and return
a JSON Web Token (JWT) that provides access to protected resources that are stored in another S3 bucket.
Upon deployment of the application, users report errors and are unable to access the protected content. A solutions architect must resolve this issue by providing
proper permissions so that users can access the protected content.
Which solution meets these requirements?

A. Update the Amazon Cognito identity pool to assume the proper IAM role for access to the protected content.
B. Update the S3 ACL to allow the application to access the protected content.
C. Redeploy the application to Amazon S3 to prevent eventually consistent reads in the S3 bucket from affecting the ability of users to access the protected content.
D. Update the Amazon Cognito pool to use custom attribute mappings within the identity pool and grant users the proper permissions to access the protected content.

Answer: B

NEW QUESTION 211


- (Exam Topic 3)
A solutions architect has created a new AWS account and must secure AWS account root user access. Which combination of actions will accomplish this?
(Choose two.)

A. Ensure the root user uses a strong password.


B. Enable multi-factor authentication to the root user.
C. Store root user access keys in an encrypted Amazon S3 bucket.
D. Add the root user to a group containing administrative permissions.
E. Apply the required permissions to the root user with an inline policy document.

Answer: AB

NEW QUESTION 216


- (Exam Topic 3)
A company runs a web application that is backed by Amazon RDS. A new database administrator caused data loss by accidentally editing information in a
database table. To help recover from this type of incident, the company wants the ability to restore the database to its state from 5 minutes before any change
within the last 30 days.
Which feature should the solutions architect include in the design to meet this requirement?

A. Read replicas
B. Manual snapshots
C. Automated backups
D. Multi-AZ deployments

Answer: C

NEW QUESTION 220


- (Exam Topic 3)
A company is using a content management system that runs on a single Amazon EC2 instance. The EC2 instance contains both the web server and the database
software. The company must make its website platform highly available and must enable the website to scale to meet user demand. What should a solutions
architect recommend to meet these requirements?

A. Move the database to Amazon RDS, and enable automatic backups. Manually launch another EC2 instance in the same Availability Zone. Configure an Application Load Balancer in the Availability Zone, and set the two instances as targets.
B. Migrate the database to an Amazon Aurora instance with a read replica in the same Availability Zone as the existing EC2 instance. Manually launch another EC2 instance in the same Availability Zone. Configure an Application Load Balancer, and set the two EC2 instances as targets.
C. Move the database to Amazon Aurora with a read replica in another Availability Zone. Create an Amazon Machine Image (AMI) from the EC2 instance. Configure an Application Load Balancer in two Availability Zones. Attach an Auto Scaling group that uses the AMI across two Availability Zones.
D. Move the database to a separate EC2 instance, and schedule backups to Amazon S3. Create an Amazon Machine Image (AMI) from the original EC2 instance. Configure an Application Load Balancer in two Availability Zones. Attach an Auto Scaling group that uses the AMI across two Availability Zones.

Answer: C


Explanation:
This approach will provide both high availability and scalability for the website platform. By moving the database to Amazon Aurora with a read replica in another
availability zone, it will provide a failover option for the database. The use of an Application Load Balancer and an Auto Scaling group across two availability zones
allows for automatic scaling of the website to meet increased user demand. Additionally, creating an AMI from the original EC2 instance allows for easy replication
of the instance in case of failure.
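The multi-AZ web tier from the recommended design can be sketched as the parameters for `autoscaling.create_auto_scaling_group`: a launch template built from the AMI, subnets in two Availability Zones, and the ALB's target group. All IDs and ARNs below are placeholders.

```python
# Sketch of the Auto Scaling group spanning two Availability Zones.
# In real use: boto3.client("autoscaling").create_auto_scaling_group(**asg_params)
asg_params = {
    "AutoScalingGroupName": "cms-web-asg",
    # Launch template created from the AMI of the original instance.
    "LaunchTemplate": {"LaunchTemplateId": "lt-0abc1234", "Version": "$Latest"},
    "MinSize": 2,   # one instance per AZ at minimum, for high availability
    "MaxSize": 6,   # scale out to meet user demand
    # One subnet per Availability Zone gives the group two AZs to span.
    "VPCZoneIdentifier": "subnet-az1-0123,subnet-az2-4567",
    # Register new instances with the Application Load Balancer's target group.
    "TargetGroupARNs": [
        "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/cms-web/abc123"
    ],
}
```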

NEW QUESTION 222


- (Exam Topic 3)
A company has an application that runs on several Amazon EC2 instances. Each EC2 instance has multiple Amazon Elastic Block Store (Amazon EBS) data
volumes attached to it. The application's EC2 instance configuration and data need to be backed up nightly. The application also needs to be recoverable in a
different AWS Region.
Which solution will meet these requirements in the MOST operationally efficient way?

A. Write an AWS Lambda function that schedules nightly snapshots of the application's EBS volumes and copies the snapshots to a different Region.
B. Create a backup plan by using AWS Backup to perform nightly backups. Copy the backups to another Region. Add the application's EC2 instances as resources.
C. Create a backup plan by using AWS Backup to perform nightly backups. Copy the backups to another Region. Add the application's EBS volumes as resources.
D. Write an AWS Lambda function that schedules nightly snapshots of the application's EBS volumes and copies the snapshots to a different Availability Zone.

Answer: B

Explanation:
The most operationally efficient solution is to create a backup plan by using AWS Backup to perform nightly backups and to copy the backups to another Region. Adding the application's EC2 instances as resources ensures that both the instance configuration and the attached EBS data volumes are backed up, and copying the backups to another Region ensures that the application is recoverable in a different AWS Region.
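The backup plan itself can be sketched as the document passed to `backup.create_backup_plan`: a nightly rule whose copy action replicates each recovery point to a vault in another Region. Vault names, the destination ARN, and the retention periods below are placeholder assumptions.

```python
# Sketch of an AWS Backup plan: nightly backups with cross-Region copy.
# In real use: boto3.client("backup").create_backup_plan(BackupPlan=backup_plan),
# then create_backup_selection(...) to add the EC2 instances as resources.
DR_VAULT_ARN = "arn:aws:backup:us-west-2:111122223333:backup-vault:dr-vault"

backup_plan = {
    "BackupPlanName": "nightly-ec2-plan",
    "Rules": [
        {
            "RuleName": "nightly",
            "TargetBackupVaultName": "primary-vault",
            "ScheduleExpression": "cron(0 5 * * ? *)",  # nightly at 05:00 UTC
            "Lifecycle": {"DeleteAfterDays": 35},       # assumed retention
            # Copy each recovery point to a vault in the DR Region.
            "CopyActions": [
                {
                    "DestinationBackupVaultArn": DR_VAULT_ARN,
                    "Lifecycle": {"DeleteAfterDays": 35},
                }
            ],
        }
    ],
}
```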

NEW QUESTION 224


- (Exam Topic 3)
A solutions architect must migrate a Windows Internet Information Services (IIS) web application to AWS. The application currently relies on a file share hosted in
the user's on-premises network-attached storage (NAS). The solutions architect has proposed migrating the IIS web servers to Amazon EC2 instances in multiple
Availability Zones that are connected to the storage solution, and configuring an Elastic Load Balancer attached to the instances.
Which replacement to the on-premises file share is MOST resilient and durable?

A. Migrate the file share to Amazon RDS


B. Migrate the file share to AWS Storage Gateway
C. Migrate the file share to Amazon FSx for Windows File Server
D. Migrate the file share to Amazon Elastic File System (Amazon EFS)

Answer: C

NEW QUESTION 227


- (Exam Topic 3)
A company runs a fleet of web servers using an Amazon RDS for PostgreSQL DB instance. After a routine compliance check, the company sets a standard that
requires a recovery point objective (RPO) of less than 1 second for all its production databases.
Which solution meets these requirements?

A. Enable a Multi-AZ deployment for the DB instance.
B. Enable auto scaling for the DB instance in one Availability Zone.
C. Configure the DB instance in one Availability Zone, and create multiple read replicas in a separate Availability Zone.
D. Configure the DB instance in one Availability Zone, and configure AWS Database Migration Service (AWS DMS) change data capture (CDC) tasks.

Answer: A

NEW QUESTION 231


- (Exam Topic 3)
A company uses Amazon EC2 instances and AWS Lambda functions to run its application. The company has VPCs with public subnets and private subnets in its
AWS account. The EC2 instances run in a private subnet in one of the VPCs. The Lambda functions need direct network access to the EC2 instances for the
application to work.
The application will run for at least 1 year. The company expects the number of Lambda functions that the application uses to increase during that time. The
company wants to maximize its savings on all application resources and to keep network latency between the services low.
Which solution will meet these requirements?

A. Purchase an EC2 Instance Savings Plan. Optimize the Lambda functions' duration and memory usage and the number of invocations. Connect the Lambda functions to the private subnet that contains the EC2 instances.
B. Purchase an EC2 Instance Savings Plan. Optimize the Lambda functions' duration and memory usage, the number of invocations, and the amount of data that is transferred. Connect the Lambda functions to a public subnet in the same VPC where the EC2 instances run.
C. Purchase a Compute Savings Plan. Optimize the Lambda functions' duration and memory usage, the number of invocations, and the amount of data that is transferred. Connect the Lambda functions to the private subnet that contains the EC2 instances.
D. Purchase a Compute Savings Plan. Optimize the Lambda functions' duration and memory usage, the number of invocations, and the amount of data that is transferred. Keep the Lambda functions in the Lambda service VPC.

Answer: C

NEW QUESTION 234


- (Exam Topic 3)
A company is building a solution that will report Amazon EC2 Auto Scaling events across all the applications in an AWS account. The company needs to use a
serverless solution to store the EC2 Auto Scaling status data in Amazon S3. The company then will use the data in Amazon S3 to provide near-real-time updates
in a dashboard. The solution must not affect the speed of EC2 instance launches.
How should the company move the data to Amazon S3 to meet these requirements?

A. Use an Amazon CloudWatch metric stream to send the EC2 Auto Scaling status data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
B. Launch an Amazon EMR cluster to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
C. Create an Amazon EventBridge rule to invoke an AWS Lambda function on a schedule. Configure the Lambda function to send the EC2 Auto Scaling status data directly to Amazon S3.
D. Use a bootstrap script during the launch of an EC2 instance to install Amazon Kinesis Agent. Configure Kinesis Agent to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.

Answer: A

Explanation:
You can use metric streams to continually stream CloudWatch metrics to a destination of your choice, with near-real-time delivery and low latency. One of the use
cases is Data Lake: create a metric stream and direct it to an Amazon Kinesis Data Firehose delivery stream that delivers your CloudWatch metrics to a data lake
such as Amazon S3.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Metric-Streams.html
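The metric stream described above can be sketched as the parameters for `cloudwatch.put_metric_stream`: a stream that forwards only the Auto Scaling namespace to a Firehose delivery stream, which in turn writes to S3. The stream name, ARNs, and account ID below are placeholders.

```python
# Sketch of a CloudWatch metric stream feeding Kinesis Data Firehose.
# In real use: boto3.client("cloudwatch").put_metric_stream(**metric_stream_params);
# the Firehose delivery stream then writes the records to the S3 bucket.
metric_stream_params = {
    "Name": "asg-status-stream",
    "FirehoseArn": "arn:aws:firehose:us-east-1:111122223333:deliverystream/asg-to-s3",
    "RoleArn": "arn:aws:iam::111122223333:role/metric-stream-role",
    "OutputFormat": "json",
    # Stream only the Auto Scaling namespace so the dashboard data stays focused.
    "IncludeFilters": [{"Namespace": "AWS/AutoScaling"}],
}
```

Because the stream is a side channel off CloudWatch, it adds nothing to the EC2 launch path, which satisfies the "must not affect launch speed" constraint.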

NEW QUESTION 236


- (Exam Topic 3)
A company has launched an Amazon RDS for MySQL DB instance. Most of the connections to the database come from serverless applications. Application traffic
to the database changes significantly at random intervals. At times of high demand, users report that their applications experience database connection rejection
errors.
Which solution will resolve this issue with the LEAST operational overhead?

A. Create a proxy in RDS Proxy. Configure the users' applications to use the DB instance through RDS Proxy.
B. Deploy Amazon ElastiCache for Memcached between the users' applications and the DB instance.
C. Migrate the DB instance to a different instance class that has higher I/O capacity. Configure the users' applications to use the new DB instance.
D. Configure Multi-AZ for the DB instance. Configure the users' applications to switch between the DB instances.

Answer: A

NEW QUESTION 238


- (Exam Topic 3)
A company wants to use Amazon S3 for the secondary copy of its on-premises dataset. The company would rarely need to access this copy. The storage
solution’s cost should be minimal.
Which storage solution meets these requirements?

A. S3 Standard
B. S3 Intelligent-Tiering
C. S3 Standard-Infrequent Access (S3 Standard-IA)
D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: C

NEW QUESTION 242


- (Exam Topic 3)
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and
sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an
Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to
Kinesis Data Streams.
What should a solutions architect do to resolve this issue?

A. Update the Kinesis Data Streams default settings by modifying the data retention period.
B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.

Answer: A

NEW QUESTION 245


- (Exam Topic 3)
A company must migrate 20 TB of data from a data center to the AWS Cloud within 30 days. The company's network bandwidth is limited to 15 Mbps and cannot
exceed 70% utilization. What should a solutions architect do to meet these requirements?

A. Use AWS Snowball.


B. Use AWS DataSync.
C. Use a secure VPN connection.
D. Use Amazon S3 Transfer Acceleration.

Answer: A


Explanation:
AWS Snowball is a secure data transport solution that accelerates moving large amounts of data into and out of the AWS Cloud; a single Snowball device can move up to 80 TB. At 15 Mbps capped at 70% utilization, transferring 20 TB over the network would take roughly 176 days, far beyond the 30-day deadline, so an offline transfer with Snowball is the only option that meets the requirements.
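The deadline check is simple arithmetic and worth running: how long does 20 TB take over a 15 Mbps link capped at 70% utilization?

```python
# Back-of-the-envelope check for the 30-day deadline.
TB = 1000**4                      # decimal terabyte, in bytes
data_bits = 20 * TB * 8           # 20 TB expressed in bits
usable_bps = 15e6 * 0.70          # 70% of 15 Mbps = 10.5 Mbps usable
seconds = data_bits / usable_bps
days = seconds / 86400            # ~176 days

# The network path misses the 30-day window by almost six months,
# which is why shipping an AWS Snowball device is the right answer.
assert days > 30
```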

NEW QUESTION 249


- (Exam Topic 3)
A company wants to use high performance computing (HPC) infrastructure on AWS for financial risk modeling. The company's HPC workloads run on Linux. Each
HPC workflow runs on hundreds of Amazon EC2 Spot Instances, is short-lived, and generates thousands of output files that are ultimately stored in persistent
storage for analytics and long-term future use.
The company seeks a cloud storage solution that permits the copying of on-premises data to long-term persistent storage to make data available for processing by
all EC2 instances. The solution should also be a high performance file system that is integrated with persistent storage to read and write datasets and output files.
Which combination of AWS services meets these requirements?

A. Amazon FSx for Lustre integrated with Amazon S3


B. Amazon FSx for Windows File Server integrated with Amazon S3
C. Amazon S3 Glacier integrated with Amazon Elastic Block Store (Amazon EBS)
D. Amazon S3 bucket with a VPC endpoint integrated with an Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume

Answer: A

Explanation:
https://aws.amazon.com/fsx/lustre/
Amazon FSx for Lustre is a fully managed service that provides cost-effective, high-performance, scalable storage for compute workloads. Many workloads such
as machine learning, high performance computing (HPC), video rendering, and financial simulations depend on compute instances accessing the same set of data
through high-performance shared storage.
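The S3-linked file system described above can be sketched as the parameters for `fsx.create_file_system`. The bucket name, subnet ID, capacity, and deployment type below are placeholder assumptions for illustration.

```python
# Sketch of an Amazon FSx for Lustre file system linked to an S3 bucket.
# In real use: boto3.client("fsx").create_file_system(**fsx_params)
fsx_params = {
    "FileSystemType": "LUSTRE",
    "StorageCapacity": 1200,            # GiB; an assumed minimal Lustre size
    "SubnetIds": ["subnet-0abc1234"],   # subnet shared with the Spot fleet
    "LustreConfiguration": {
        # Link to the persistent S3 bucket: reads lazy-load objects from S3,
        # and processed output files can be exported back for long-term use.
        "ImportPath": "s3://hpc-datasets",
        "ExportPath": "s3://hpc-datasets/results",
        "DeploymentType": "SCRATCH_2",  # throughput-optimized, short-lived jobs
    },
}
```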

NEW QUESTION 251


- (Exam Topic 3)
A company is using Amazon Route 53 latency-based routing to route requests to its UDP-based application for users around the world. The application is hosted
on redundant servers in the company's on-premises data centers in the United States, Asia, and Europe. The company's compliance requirements state that the
application must be hosted on premises. The company wants to improve the performance and availability of the application.
What should a solutions architect do to meet these requirements?

A. Configure three Network Load Balancers (NLBs) in the three AWS Regions to address the on-premises endpoints. Create an accelerator by using AWS Global Accelerator, and register the NLBs as its endpoints. Provide access to the application by using a CNAME that points to the accelerator DNS.
B. Configure three Application Load Balancers (ALBs) in the three AWS Regions to address the on-premises endpoints. Create an accelerator by using AWS Global Accelerator, and register the ALBs as its endpoints. Provide access to the application by using a CNAME that points to the accelerator DNS.
C. Configure three Network Load Balancers (NLBs) in the three AWS Regions to address the on-premises endpoints. In Route 53, create a latency-based record that points to the three NLBs, and use it as an origin for an Amazon CloudFront distribution. Provide access to the application by using a CNAME that points to the CloudFront DNS.
D. Configure three Application Load Balancers (ALBs) in the three AWS Regions to address the on-premises endpoints. In Route 53, create a latency-based record that points to the three ALBs, and use it as an origin for an Amazon CloudFront distribution. Provide access to the application by using a CNAME that points to the CloudFront DNS.

Answer: A

NEW QUESTION 255


- (Exam Topic 3)
A solutions architect is designing a company's disaster recovery (DR) architecture. The company has a MySQL database that runs on an Amazon EC2 instance in
a private subnet with scheduled backups. The DR design must include multiple AWS Regions.
Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the MySQL database to multiple EC2 instances. Configure a standby EC2 instance in the DR Region. Turn on replication.
B. Migrate the MySQL database to Amazon RDS. Use a Multi-AZ deployment. Turn on read replication for the primary DB instance in different Availability Zones.
C. Migrate the MySQL database to an Amazon Aurora global database. Host the primary DB cluster in the primary Region. Host the secondary DB cluster in the DR Region.
D. Store the scheduled backup of the MySQL database in an Amazon S3 bucket that is configured for S3 Cross-Region Replication (CRR). Use the data backup to restore the database in the DR Region.

Answer: C

NEW QUESTION 259


- (Exam Topic 3)
A hospital is designing a new application that gathers symptoms from patients. The hospital has decided to use Amazon Simple Queue Service (Amazon SQS)
and Amazon Simple Notification Service (Amazon SNS) in the architecture.
A solutions architect is reviewing the infrastructure design. Data must be encrypted at rest and in transit. Only authorized personnel of the hospital should be able to
access the data.
Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

A. Turn on server-side encryption on the SQS components. Update the default key policy to restrict key usage to a set of authorized principals.
B. Turn on server-side encryption on the SNS components by using an AWS Key Management Service (AWS KMS) customer managed key. Apply a key policy to restrict key usage to a set of authorized principals.


C. Turn on encryption on the SNS components. Update the default key policy to restrict key usage to a set of authorized principals. Set a condition in the topic policy to allow only encrypted connections over TLS.
D. Turn on server-side encryption on the SQS components by using an AWS Key Management Service (AWS KMS) customer managed key. Apply a key policy to restrict key usage to a set of authorized principals. Set a condition in the queue policy to allow only encrypted connections over TLS.
E. Turn on server-side encryption on the SQS components by using an AWS Key Management Service (AWS KMS) customer managed key. Apply an IAM policy to restrict key usage to a set of authorized principals. Set a condition in the queue policy to allow only encrypted connections over TLS.

Answer: BD

NEW QUESTION 263


- (Exam Topic 3)
A company is deploying a new application on Amazon EC2 instances. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The
company needs to ensure that all data that is written to the EBS volumes is encrypted at rest.
Which solution will meet this requirement?

A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.

Answer: B

NEW QUESTION 266


- (Exam Topic 3)
A company hosts a multiplayer gaming application on AWS. The company wants the application to read data with sub-millisecond latency and run one-time
queries on historical data.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon RDS for data that is frequently accessed. Run a periodic custom script to export the data to an Amazon S3 bucket.
B. Store the data directly in an Amazon S3 bucket. Implement an S3 Lifecycle policy to move older data to S3 Glacier Deep Archive for long-term storage. Run one-time queries on the data in Amazon S3 by using Amazon Athena.
C. Use Amazon DynamoDB with DynamoDB Accelerator (DAX) for data that is frequently accessed. Export the data to an Amazon S3 bucket by using DynamoDB table export. Run one-time queries on the data in Amazon S3 by using Amazon Athena.
D. Use Amazon DynamoDB for data that is frequently accessed. Turn on streaming to Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read the data from Kinesis Data Streams. Store the records in an Amazon S3 bucket.

Answer: C

NEW QUESTION 270


- (Exam Topic 3)
A company collects data from thousands of remote devices by using a RESTful web services application that runs on an Amazon EC2 instance. The EC2 instance
receives the raw data, transforms the raw data, and stores all the data in an Amazon S3 bucket. The number of remote devices will increase into the millions soon.
The company needs a highly scalable solution that minimizes operational overhead.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Use AWS Glue to process the raw data in Amazon S3.
B. Use Amazon Route 53 to route traffic to different EC2 instances.
C. Add more EC2 instances to accommodate the increasing amount of incoming data.
D. Send the raw data to Amazon Simple Queue Service (Amazon SQS). Use EC2 instances to process the data.
E. Use Amazon API Gateway to send the raw data to an Amazon Kinesis data stream. Configure Amazon Kinesis Data Firehose to use the data stream as a source to deliver the data to Amazon S3.

Answer: AE

Explanation:
"RESTful web services" => API Gateway.
"EC2 instance receives the raw data, transforms the raw data, and stores all the data in an Amazon S3 bucket"
=> GLUE with (Extract - Transform - Load)
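The Kinesis-to-S3 leg of this pipeline can be sketched as the parameters for `firehose.create_delivery_stream`, using the Kinesis data stream as the source. The stream name, role ARNs, and bucket below are placeholders.

```python
# Sketch of a Kinesis Data Firehose delivery stream that reads from the
# Kinesis data stream fed by API Gateway and writes raw records to S3.
# In real use: boto3.client("firehose").create_delivery_stream(**delivery_stream_params)
delivery_stream_params = {
    "DeliveryStreamName": "device-data-to-s3",
    # Read from an existing Kinesis data stream instead of direct PUTs.
    "DeliveryStreamType": "KinesisStreamAsSource",
    "KinesisStreamSourceConfiguration": {
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:111122223333:stream/device-data",
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-read-kinesis",
    },
    # Land the raw data in the S3 bucket for AWS Glue to process.
    "S3DestinationConfiguration": {
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-write-s3",
        "BucketARN": "arn:aws:s3:::device-raw-data",
    },
}
```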

NEW QUESTION 274


- (Exam Topic 3)
A company has a three-tier environment on AWS that ingests sensor data from its users' devices. The traffic flows through a Network Load Balancer (NLB), then to
Amazon EC2 instances for the web tier, and finally to EC2 instances for the application tier that makes database calls.
What should a solutions architect do to improve the security of data in transit to the web tier?

A. Configure a TLS listener and add the server certificate on the NLB
B. Configure AWS Shield Advanced and enable AWS WAF on the NLB
C. Change the load balancer to an Application Load Balancer and attach AWS WAF to it
D. Encrypt the Amazon Elastic Block Store (Amazon EBS) volume on the EC2 instances using AWS Key Management Service (AWS KMS)

Answer: A


NEW QUESTION 276


......


THANKS FOR TRYING THE DEMO OF OUR PRODUCT

Visit Our Site to Purchase the Full Set of Actual AWS-Solution-Architect-Associate Exam Questions With
Answers.

We Also Provide Practice Exam Software That Simulates Real Exam Environment And Has Many Self-Assessment Features. Order the AWS-Solution-Architect-Associate Product From:

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/

Money Back Guarantee

AWS-Solution-Architect-Associate Practice Exam Features:

* AWS-Solution-Architect-Associate Questions and Answers Updated Frequently

* AWS-Solution-Architect-Associate Practice Questions Verified by Expert Senior Certified Staff

* AWS-Solution-Architect-Associate Most Realistic Questions that Guarantee you a Pass on Your First Try

* AWS-Solution-Architect-Associate Practice Test Questions in Multiple Choice Formats and Updates for 1 Year



