
Braindump2go Free Microsoft Dumps Questions Collection

Latest Real Exam Questions and Answers to help you pass Microsoft and other hot exams 100%!

[2020-August-New]High Quality Braindump2go SAA-C02 Dumps PDF SAA-C02 301Q Free Share[263-283]

August/2020 new Braindump2go SAA-C02 Exam Dumps with PDF and VCE are free and updated today! The following are some new SAA-C02 Real Exam Questions!

QUESTION 263
A solutions architect is designing a multi-Region disaster recovery solution for an application that will provide public API access.
The application will use Amazon EC2 instances with a userdata script to load application code and an Amazon RDS for MySQL database.
The Recovery Time Objective (RTO) is 3 hours and the Recovery Point Objective (RPO) is 24 hours.
Which architecture would meet these requirements at the LOWEST cost?

A.    Use an Application Load Balancer for Region failover.
Deploy new EC2 instances with the userdata script.
Deploy separate RDS instances in each Region
B.    Use Amazon Route 53 for Region failover.
Deploy new EC2 instances with the userdata script.
Create a read replica of the RDS instance in a backup Region
C.    Use Amazon API Gateway for the public APIs and Region failover.
Deploy new EC2 instances with the userdata script.
Create a MySQL read replica of the RDS instance in a backup Region
D.    Use Amazon Route 53 for Region failover.
Deploy new EC2 instances with the userdata script for APIs, and create a snapshot of the RDS instance daily for a backup.
Replicate the snapshot to a backup Region

Answer: D
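
For reference, a rough boto3 sketch of the snapshot-replication piece of option D; it is run in the backup Region and all identifiers are hypothetical:

import boto3

# Run in the backup (destination) Region; identifiers below are hypothetical.
rds = boto3.client("rds", region_name="us-west-2")

rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="arn:aws:rds:us-east-1:111122223333:snapshot:app-db-2020-08-01",
    TargetDBSnapshotIdentifier="app-db-2020-08-01-dr-copy",
    SourceRegion="us-east-1",  # lets boto3 build the presigned URL for the cross-Region copy
)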

QUESTION 264
A solutions architect is designing a new API using Amazon API Gateway that will receive requests from users.
The volume of requests is highly variable; several hours can pass without receiving a single request.
The data processing will take place asynchronously but should be completed within a few seconds after a request is made.
Which compute service should the solutions architect have the API invoke to deliver the requirements at the lowest cost?

A.    An AWS Glue job
B.    An AWS Lambda function
C.    A containerized service hosted in Amazon Elastic Kubernetes Service (Amazon EKS)
D.    A containerized service hosted in Amazon ECS with Amazon EC2

Answer: B
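
For reference, a minimal sketch of the Lambda handler API Gateway could invoke; the event shape and the processing step are assumptions:

import json

def handler(event, context):
    # Invoked by API Gateway; there is no idle cost between requests.
    payload = json.loads(event.get("body") or "{}")
    process(payload)  # hypothetical processing step, finishes within seconds
    return {"statusCode": 202, "body": json.dumps({"status": "accepted"})}

def process(payload):
    # Placeholder for the asynchronous data processing logic.
    pass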

QUESTION 265
A development team needs to host a website that will be accessed by other teams.
The website's contents consist of HTML, CSS, client-side JavaScript, and images.
Which method is the MOST cost-effective for hosting the website?

A.    Containerize the website and host it in AWS Fargate
B.    Create an Amazon S3 bucket and host the website there.
C.    Deploy a web server on an Amazon EC2 instance to host the website.
D.    Configure an Application Load Balancer with an AWS Lambda target that uses the Express.js framework

Answer: B
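
For reference, a minimal boto3 sketch of enabling static website hosting on the bucket; the bucket name is hypothetical, and a bucket policy (or CloudFront) would still be needed to grant read access:

import boto3

s3 = boto3.client("s3")
bucket = "example-team-website"  # hypothetical bucket name

s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload static assets with an explicit content type
s3.upload_file("index.html", bucket, "index.html", ExtraArgs={"ContentType": "text/html"})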

QUESTION 266
A company has media and application files that need to be shared internally.
Users currently are authenticated using Active Directory and access files from a Microsoft Windows platform.
The chief executive officer wants to keep the same user permissions, but wants the company to improve the process as the company is reaching its storage capacity limit.
What should a solutions architect recommend?

A.    Set up a corporate Amazon S3 bucket and move all media and application files.
B.    Configure Amazon FSx for Windows File Server and move all the media and application files.
C.    Configure Amazon Elastic File System (Amazon EFS) and move all media and application files.
D.    Set up Amazon EC2 on Windows, attach multiple Amazon Elastic Block Store (Amazon EBS) volumes, and move all media and application files.

Answer: B
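
For reference, a rough boto3 sketch of creating an Amazon FSx for Windows File Server file system joined to a directory; the subnet, directory ID, size, and throughput values are hypothetical:

import boto3

fsx = boto3.client("fsx")

fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=2048,                    # GiB, hypothetical
    SubnetIds=["subnet-0123456789abcdef0"],  # hypothetical subnet
    WindowsConfiguration={
        "ActiveDirectoryId": "d-1234567890", # hypothetical AWS Managed Microsoft AD
        "ThroughputCapacity": 32,            # MB/s
        "DeploymentType": "SINGLE_AZ_2",
    },
)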

QUESTION 267
A company is moving its legacy workload to the AWS Cloud.
The workload files will be shared, appended, and frequently accessed through Amazon EC2 instances when they are first created.
The files will be accessed only occasionally as they age.
What should a solutions architect recommend?

A.    Store the data using Amazon EC2 instances with attached Amazon Elastic Block Store (Amazon EBS) data volumes
B.    Store the data using AWS Storage Gateway volume gateway and export rarely accessed data to Amazon S3 storage
C.    Store the data using Amazon Elastic File System (Amazon EFS) with lifecycle management enabled for rarely accessed data
D.    Store the data using Amazon S3 with an S3 lifecycle policy enabled to move data to S3 Standard-Infrequent Access (S3 Standard-IA)

Answer: C
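
For reference, a minimal boto3 sketch of enabling EFS lifecycle management so rarely accessed files move to the Infrequent Access storage class; the file system ID and age threshold are hypothetical:

import boto3

efs = boto3.client("efs")

efs.put_lifecycle_configuration(
    FileSystemId="fs-0123456789abcdef0",      # hypothetical file system ID
    LifecyclePolicies=[
        {"TransitionToIA": "AFTER_30_DAYS"},  # move files not accessed for 30 days to EFS IA
    ],
)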

QUESTION 268
A company is deploying a multi-instance application within AWS that requires minimal latency between the instances.
What should a solutions architect recommend?

A.    Use an Auto Scaling group with a cluster placement group.
B.    Use an Auto Scaling group with a single Availability Zone in the same AWS Region.
C.    Use an Auto Scaling group with multiple Availability Zones in the same AWS Region.
D.    Use a Network Load Balancer with multiple Amazon EC2 Dedicated Hosts as the targets

Answer: A

QUESTION 269
A company receives structured and semi-structured data from various sources once every day.
A solutions architect needs to design a solution that leverages big data processing frameworks.
The data should be accessible using SQL queries and business intelligence tools.
What should the solutions architect recommend to build the MOST high-performing solution?

A.    Use AWS Glue to process data and Amazon S3 to store data
B.    Use Amazon EMR to process data and Amazon Redshift to store data
C.    Use Amazon EC2 to process data and Amazon Elastic Block Store (Amazon EBS) to store data
D.    Use Amazon Kinesis Data Analytics to process data and Amazon Elastic File System (Amazon EFS) to store data

Answer: B
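
For reference, a rough boto3 sketch of launching an EMR cluster with Spark and Hive for the daily processing; the release label, instance settings, and default roles are assumptions, and the results would then be loaded into Amazon Redshift for SQL and BI access:

import boto3

emr = boto3.client("emr")

emr.run_job_flow(
    Name="daily-batch-processing",
    ReleaseLabel="emr-5.30.0",                 # hypothetical release
    Applications=[{"Name": "Spark"}, {"Name": "Hive"}],
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate when the daily job finishes
    },
    JobFlowRole="EMR_EC2_DefaultRole",         # assumes the EMR default roles exist
    ServiceRole="EMR_DefaultRole",
)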

QUESTION 270
A company is designing a website that uses an Amazon S3 bucket to store static images.
The company wants all future requests to have faster response times while reducing both latency and cost.
Which service configuration should a solutions architect recommend?

A.    Deploy a NAT server in front of Amazon S3.
B.    Deploy Amazon CloudFront in front of Amazon S3.
C.    Deploy a Network Load Balancer in front of Amazon S3.
D.    Configure Auto Scaling to automatically adjust the capacity of the website.

Answer: B

QUESTION 271
What should a solutions architect do to ensure that all objects uploaded to an Amazon S3 bucket are encrypted?

A.    Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set
B.    Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set to private
C.    Update the bucket policy to deny if the PutObject does not have an aws:SecureTransport header set to true
D.    Update the bucket policy to deny if the PutObject does not have an x-amz-server-side-encryption header set

Answer: D
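
For reference, a sketch of the bucket policy behind option D, applied with boto3; the bucket name is hypothetical. It denies any PutObject request that does not carry the x-amz-server-side-encryption header:

import json
import boto3

bucket = "example-secure-bucket"  # hypothetical bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyUnencryptedObjectUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            # Deny when the x-amz-server-side-encryption header is missing
            "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}},
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))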

QUESTION 272
A company runs a high performance computing (HPC) workload on AWS.
The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication.
The Amazon EC2 instances are properly sized for compute and storage capacity, and are launched using default options.
What should a solutions architect propose to improve the performance of the workload?

A.    Choose a cluster placement group while launching Amazon EC2 instances
B.    Choose dedicated instance tenancy while launching Amazon EC2 instances
C.    Choose an Elastic Inference accelerator while launching Amazon EC2 instances
D.    Choose the required capacity reservation while launching Amazon EC2 instances.

Answer: A
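
For reference, a minimal boto3 sketch of launching instances into a cluster placement group; the AMI, instance type, and count are hypothetical:

import boto3

ec2 = boto3.client("ec2")

ec2.create_placement_group(GroupName="hpc-cluster", Strategy="cluster")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI
    InstanceType="c5n.18xlarge",      # hypothetical network-optimized instance type
    MinCount=4,
    MaxCount=4,
    Placement={"GroupName": "hpc-cluster"},
)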

QUESTION 273
A company’s dynamic website is hosted using on-premises servers in the United States.
The company is launching its product in Europe and it wants to optimize site loading times for new European users.
The site's backend must remain in the United States. The product is being launched in a few days, and an immediate solution is needed.
What should the solutions architect recommend?

A.    Launch an Amazon EC2 instance in us-east-1 and migrate the site to it
B.    Move the website to Amazon S3. Use cross-Region replication between Regions.
C.    Use Amazon CloudFront with a custom origin pointing to the on-premises servers
D.    Use an Amazon Route 53 geoproximity routing policy pointing to on-premises servers

Answer: C
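
For reference, a rough boto3 sketch of creating a CloudFront distribution with the on-premises servers as a custom origin; the origin hostname and cache settings are assumptions, and the legacy ForwardedValues settings are used only to keep the example short:

import time
import boto3

cloudfront = boto3.client("cloudfront")

cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),
        "Comment": "Accelerate dynamic site for European users",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": "onprem-origin",
                    "DomainName": "origin.example.com",  # hypothetical on-premises hostname
                    "CustomOriginConfig": {
                        "HTTPPort": 80,
                        "HTTPSPort": 443,
                        "OriginProtocolPolicy": "https-only",
                    },
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "onprem-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            "ForwardedValues": {"QueryString": True, "Cookies": {"Forward": "all"}},
            "MinTTL": 0,
        },
    }
)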

QUESTION 274
A company is building a media-sharing application and decides to use Amazon S3 for storage.
When a media file is uploaded, the company starts a multi-step process to create thumbnails, identify objects in the images, transcode videos into standard formats and resolutions, and extract and store the metadata in an Amazon DynamoDB table.
The metadata is used for searching and navigation. The amount of traffic is variable. The solution must be able to scale to handle spikes in load without unnecessary expenses.
What should a solutions architect recommend to support this workload?

A.    Build the processing into the website or mobile app used to upload the content to Amazon S3.
Save the required data to the DynamoDB table when the objects are uploaded
B.    Trigger AWS Step Functions when an object is stored in the S3 bucket.
Have the Step Functions perform the steps needed to process the object and then write the metadata to the DynamoDB table
C.    Trigger an AWS Lambda function when an object is stored in the S3 bucket.
Have the Lambda function start AWS Batch to perform the steps to process the object.
Place the object data in the DynamoDB table when complete
D.    Trigger an AWS Lambda function to store an initial entry in the DynamoDB table when an object is uploaded to Amazon S3.
Use a program running on an Amazon EC2 instance in an Auto Scaling group to poll the index for unprocessed items, and use the program to perform the processing.

Answer: B
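
For reference, a minimal sketch of the glue for option B: a Lambda function subscribed to the S3 event starts a Step Functions state machine that runs the thumbnail, recognition, transcoding, and metadata steps. The state machine ARN is hypothetical:

import json
import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:111122223333:stateMachine:MediaPipeline"  # hypothetical

def handler(event, context):
    # One execution per uploaded object; Step Functions scales with the upload rate.
    for record in event["Records"]:
        obj = {
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }
        sfn.start_execution(stateMachineArn=STATE_MACHINE_ARN, input=json.dumps(obj))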

QUESTION 275
A company has recently updated its internal security standards.
The company must now ensure all Amazon S3 buckets and Amazon Elastic Block Store (Amazon EBS) volumes are encrypted with keys created and periodically rotated by internal security specialists.
The company is looking for a native, software-based AWS service to accomplish this goal.
What should a solutions architect recommend as a solution?

A.    Use AWS Secrets Manager with customer master keys (CMKs) to store master key material and apply a routine to create a new CMK periodically and replace it in AWS Secrets Manager.
B.    Use AWS Key Management Service (AWS KMS) with customer master keys (CMKs) to store master key material and apply a routine to re-create a new key periodically and replace it in AWS KMS.
C.    Use an AWS CloudHSM cluster with customer master keys (CMKs) to store master key material and apply a routine to re-create a new key periodically and replace it in the CloudHSM cluster nodes.
D.    Use AWS Systems Manager Parameter Store with customer master keys (CMKs) to store master key material and apply a routine to re-create a new key periodically and replace it in the Parameter Store.

Answer: B
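
For reference, a minimal boto3 sketch of creating a CMK in AWS KMS and rotating it manually by pointing an alias at a newly created key; the alias name is hypothetical, and callers that reference the alias use the new key for new encryption operations:

import boto3

kms = boto3.client("kms")

# Initial key creation
key_id = kms.create_key(Description="Company data-at-rest CMK")["KeyMetadata"]["KeyId"]
kms.create_alias(AliasName="alias/company-data", TargetKeyId=key_id)

# Periodic manual rotation: create a new key and repoint the alias
new_key_id = kms.create_key(Description="Company data-at-rest CMK (rotated)")["KeyMetadata"]["KeyId"]
kms.update_alias(AliasName="alias/company-data", TargetKeyId=new_key_id)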

QUESTION 276
A solutions architect must design a solution that uses Amazon CloudFront with an Amazon S3 bucket to store a static website.
The company's security policy requires that all website traffic be inspected by AWS WAF.
How should the solutions architect comply with these requirements?

A.    Configure an S3 bucket policy to accept requests coming from the AWS WAF Amazon Resource Name (ARN) only
B.    Configure Amazon CloudFront to forward all incoming requests to AWS WAF before requesting content from the S3 origin.
C.    Configure a security group that allows Amazon CloudFront IP addresses to access Amazon S3 only. Associate AWS WAF with CloudFront.
D.    Configure Amazon CloudFront and Amazon S3 to use an origin access identity (OAI) to restrict access to the S3 bucket. Enable AWS WAF on the distribution.

Answer: D
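
For reference, a rough boto3 sketch of the origin access identity half of option D: create an OAI and grant only that identity read access to the bucket (the bucket name is hypothetical). The OAI is then referenced in the distribution's S3 origin, and an AWS WAF web ACL is associated with the distribution:

import json
import time
import boto3

cloudfront = boto3.client("cloudfront")
s3 = boto3.client("s3")
bucket = "example-static-site"  # hypothetical bucket name

oai = cloudfront.create_cloud_front_origin_access_identity(
    CloudFrontOriginAccessIdentityConfig={
        "CallerReference": str(time.time()),
        "Comment": "OAI for the static website",
    }
)
canonical_user = oai["CloudFrontOriginAccessIdentity"]["S3CanonicalUserId"]

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"CanonicalUser": canonical_user},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }
    ],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))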

QUESTION 277
A company has copied 1 PB of data from a colocation facility to an Amazon S3 bucket in the us-east-1 Region using an AWS Direct Connect link.
The company now wants to copy the data to another S3 bucket in the us-west-2 Region.
The colocation facility does not allow the use of AWS Snowball.
What should a solutions architect recommend to accomplish this?

A.    Order a Snowball Edge device to copy the data from one Region to another Region.
B.    Transfer contents from the source S3 bucket to a target S3 bucket using the S3 console.
C.    Use the aws s3 sync command to copy data from the source bucket to the destination bucket.
D.    Add a cross-Region replication configuration to copy objects across S3 buckets in different Regions.

Answer: C

QUESTION 279
An engineering team is developing and deploying AWS Lambda functions.
The team needs to create roles and manage policies in AWS IAM to configure the permissions of the Lambda functions.
How should the permissions for the team be configured so they also adhere to the concept of least privilege?

A.    Create an IAM role with a managed policy attached.
Allow the engineering team and the Lambda functions to assume this role
B.    Create an IAM group for the engineering team with an IAMFullAccess policy attached.
Add all the users from the team to this IAM group
C.    Create an execution role for the Lambda functions.
Attach a managed policy that has permission boundaries specific to these Lambda functions
D.    Create an IAM role with a managed policy attached that has permission boundaries specific to the Lambda functions.
Allow the engineering team to assume this role.

Answer: D

QUESTION 280
A company needs a secure connection between its on-premises environment and AWS.
This connection does not need high bandwidth and will handle a small amount of traffic.
The connection should be set up quickly.
What is the MOST cost-effective method to establish this type of connection?

A.    Implement a client VPN
B.    Implement AWS Direct Connect
C.    Implement a bastion host on Amazon EC2.
D.    Implement an AWS Site-to-Site VPN connection.

Answer: D
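
For reference, a rough boto3 sketch of the Site-to-Site VPN pieces; the customer gateway IP, ASN, and VPC ID are hypothetical:

import boto3

ec2 = boto3.client("ec2")

cgw = ec2.create_customer_gateway(
    Type="ipsec.1",
    PublicIp="203.0.113.10",  # hypothetical on-premises public IP
    BgpAsn=65000,
)["CustomerGateway"]["CustomerGatewayId"]

vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]["VpnGatewayId"]
ec2.attach_vpn_gateway(VpnGatewayId=vgw, VpcId="vpc-0123456789abcdef0")  # hypothetical VPC

ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=cgw,
    VpnGatewayId=vgw,
    Options={"StaticRoutesOnly": True},
)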

QUESTION 281
A company is building a payment application that must be highly available even during regional service disruptions.
A solutions architect must design a data storage solution that can be easily replicated and used in other AWS Regions.
The application also requires low-latency atomicity, consistency, isolation, and durability (ACID) transactions that need to be immediately available to generate reports.
The development team also needs to use SQL.
Which data storage solution meets these requirements?

A.    Amazon Aurora Global Database
B.    Amazon DynamoDB global tables
C.    Amazon S3 with cross-Region replication and Amazon Athena
D.    MySQL on Amazon EC2 instances with Amazon Elastic Block Store (Amazon EBS) snapshot replication

Answer: A
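
For reference, a rough boto3 sketch of creating the Aurora global database and its primary cluster; identifiers and credentials are hypothetical, and a secondary cluster would be added in another Region the same way by referencing the global cluster:

import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_global_cluster(
    GlobalClusterIdentifier="payments-global",
    Engine="aurora-mysql",
)

rds.create_db_cluster(
    DBClusterIdentifier="payments-primary",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="payments-global",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",  # placeholder credential
)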

QUESTION 282
A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application.
The media files must be resilient to the loss of an Availability Zone. Some files are accessed frequently while other files are rarely accessed in an unpredictable pattern.
The solutions architect must minimize the costs of storing and retrieving the media files.
Which storage option meets these requirements?

A.    S3 Standard
B.    S3 Intelligent-Tiering
C.    S3 Standard-Infrequent Access (S3 Standard-IA)
D.    S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: B
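
For reference, a minimal boto3 sketch of storing a media file directly in S3 Intelligent-Tiering at upload time; the bucket and key are hypothetical:

import boto3

s3 = boto3.client("s3")

s3.upload_file(
    "video.mp4",
    "example-media-bucket",  # hypothetical bucket
    "uploads/video.mp4",
    ExtraArgs={"StorageClass": "INTELLIGENT_TIERING"},
)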

QUESTION 283
A company uses a legacy on-premises analytics application that operates on gigabytes of csv files and represents months of data.
The legacy application cannot handle the growing size of the csv files. New csv files are added daily from various data sources to a central on-premises storage location.
The company wants to continue to support the legacy application while users learn AWS analytics services.
To achieve this, a solutions architect wants to maintain two synchronized copies of all the csv files on-premises and in Amazon S3.
Which solution should the solutions architect recommend?

A.    Deploy AWS DataSync on-premises.
Configure DataSync to continuously replicate the csv files between the company’s on-premises storage and the company’s S3 bucket
B.    Deploy an on-premises file gateway.
Configure data sources to write the csv files to the file gateway.
Point the legacy analytics application to the file gateway.
The file gateway should replicate the csv files to Amazon S3
C.    Deploy an on-premises volume gateway.
Configure data sources to write the csv files to the volume gateway.
Point the legacy analytics application to the volume gateway.
The volume gateway should replicate data to Amazon S3.
D.    Deploy AWS DataSync on-premises.
Configure DataSync to continuously replicate the csv files between on-premises and Amazon Elastic File System (Amazon EFS).
Enable replication from Amazon EFS to the company’s S3 bucket.

Answer: B


Resources From:

1.2020 Latest Braindump2go SAA-C02 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/saa-c02.html

2.2020 Latest Braindump2go SAA-C02 PDF and SAA-C02 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1_5IK3H_eM74C6AKwU7sKaLn1rrn8xTfm?usp=sharing

3.2020 Free Braindump2go SAA-C02 PDF Download:
https://www.braindump2go.com/free-online-pdf/SAA-C02-Dumps(285-301).pdf
https://www.braindump2go.com/free-online-pdf/SAA-C02-PDF-Dumps(263-273).pdf
https://www.braindump2go.com/free-online-pdf/SAA-C02-VCE-Dumps(274-284).pdf

Free Resources from Braindump2go, We Are Devoted to Helping You 100% Pass All Exams!

