Which AWS service can a company use to store and manage Docker images?
A. Amazon DynamoDB
B. Amazon Kinesis Data Streams
C. Amazon Elastic Container Registry (Amazon ECR)
D. Amazon Elastic File System (Amazon EFS)
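Amazon ECR (option C) is the managed Docker registry. As a hedged illustration of what "store and manage Docker images" looks like in practice, the sketch below builds the registry URI that `docker push` targets for a private ECR repository; the account ID, Region, and repository name are placeholders.

```python
def ecr_repository_uri(account_id: str, region: str, repo_name: str) -> str:
    """Build the registry URI for a private Amazon ECR repository
    (hypothetical helper; values below are placeholders)."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repo_name}"

# Typical push workflow (run with the AWS CLI and Docker, not shown here):
#   1. aws ecr get-login-password | docker login --username AWS --password-stdin <registry>
#   2. docker tag my-app:latest <uri>:latest
#   3. docker push <uri>:latest
uri = ecr_repository_uri("123456789012", "us-east-1", "my-app")
print(uri)  # 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app
```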
A company has an application in which users create a large number of files. The company plans to migrate the application from its on-premises data center to AWS. Currently, the application uploads the files to a shared storage system, and a separate fleet of servers then processes the files. Access to the files is controlled through Linux file system permissions. The company needs to migrate the fleet of servers to Amazon EC2 instances. The company must maximize storage scalability and durability without changing the code of the existing application. Which solution will meet these requirements?
A. Migrate the files to an Amazon S3 bucket. Mount the S3 bucket on the EC2 instances.
B. Migrate the files to a set of Amazon EC2 instance store volumes. Mount the instance store volumes on the EC2 instances.
C. Migrate the files to a set of Amazon Elastic Block Store (Amazon EBS) volumes. Mount the EBS volumes on the EC2 instances.
D. Migrate the files to an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on the EC2 instances.
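EFS (option D) fits because it presents an NFS file system with standard POSIX semantics, so application code that relies on Linux file permissions keeps working unchanged. The sketch below shows the kind of calls such an application makes; it runs against a temporary directory here, but the same calls behave identically when the base path is an EFS mount such as /mnt/efs (that path is an assumption for illustration).

```python
import os
import stat
import tempfile

# The application relies only on standard POSIX file permissions. Amazon
# EFS exposes an NFS file system with POSIX semantics, so these calls are
# identical whether `base` is a local directory or an EFS mount point.
base = tempfile.mkdtemp()
path = os.path.join(base, "report.dat")

with open(path, "w") as f:
    f.write("survey data")

os.chmod(path, 0o640)  # owner read/write, group read, others no access

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o640
```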
A company copies 200 TB of data from a recent ocean survey onto AWS Snowball Edge Storage Optimized devices. The company has a high performance computing (HPC) cluster that is hosted on AWS to look for oil and gas deposits. A solutions architect must provide the cluster with consistent sub-millisecond latency and high-throughput access to the data on the Snowball Edge Storage Optimized devices. The company is sending the devices back to AWS.
Which solution will meet these requirements?
A. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an AWS Storage Gateway file gateway to use the S3 bucket. Access the file gateway from the HPC cluster instances.
B. Create an Amazon S3 bucket. Import the data into the S3 bucket. Configure an Amazon FSx for Lustre file system and integrate it with the S3 bucket. Access the FSx for Lustre file system from the HPC cluster instances.
C. Create an Amazon S3 bucket and an Amazon Elastic File System (Amazon EFS) file system. Import the data into the S3 bucket. Copy the data from the S3 bucket to the EFS file system. Access the EFS file system from the HPC cluster instances.
D. Create an Amazon FSx for Lustre file system. Import the data directly into the FSx for Lustre file system. Access the FSx for Lustre file system from the HPC cluster instances.
A company runs a shopping application that uses Amazon DynamoDB to store customer information. In case of data corruption, a solutions architect needs to design a solution that meets a recovery point objective (RPO) of 15 minutes and a recovery time objective (RTO) of 1 hour.
What should the solutions architect recommend to meet these requirements?
A. Configure DynamoDB global tables. For RPO recovery, point the application to a different AWS Region.
B. Configure DynamoDB point-in-time recovery. For RPO recovery, restore to the desired point in time.
C. Export the DynamoDB data to Amazon S3 Glacier on a daily basis. For RPO recovery, import the data from S3 Glacier to DynamoDB.
D. Schedule Amazon Elastic Block Store (Amazon EBS) snapshots for the DynamoDB table every 15 minutes. For RPO recovery, restore the DynamoDB table by using the EBS snapshot.
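Point-in-time recovery (option B) gives DynamoDB continuous backups with per-second restore granularity over the preceding 35 days, which comfortably satisfies a 15-minute RPO. As a sketch, the dictionaries below are the parameters for the two boto3 calls involved; no live call is made here, and the table names are placeholders.

```python
from datetime import datetime, timedelta, timezone

# Parameters for dynamodb.update_continuous_backups(**enable_pitr):
# turn on point-in-time recovery for the table.
enable_pitr = {
    "TableName": "Customers",  # placeholder table name
    "PointInTimeRecoverySpecification": {"PointInTimeRecoveryEnabled": True},
}

# Parameters for dynamodb.restore_table_to_point_in_time(**restore):
# restore to a new table at a timestamp just before the corruption.
restore_time = datetime.now(timezone.utc) - timedelta(minutes=15)
restore = {
    "SourceTableName": "Customers",
    "TargetTableName": "Customers-restored",
    "RestoreDateTime": restore_time,
}

print(enable_pitr["PointInTimeRecoverySpecification"]["PointInTimeRecoveryEnabled"])
```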
A company processes images into thumbnails and returns an email confirmation to the end user upon completion. The company's existing solution is facing performance bottlenecks and scalability issues. The company wants to migrate this process to AWS and implement a solution that requires the least possible configuration.
Which solution meets these requirements?
A. Use Amazon S3 to store images and send notifications to AWS Lambda. Configure an AWS Lambda function to process the images into thumbnails, store the thumbnails in Amazon S3, and send an email confirmation through Amazon Simple Email Service (Amazon SES).
B. Use Amazon S3 to store images and send notifications to Amazon Simple Queue Service (Amazon SQS). Configure an Amazon EC2 instance to poll the SQS queue to process the images into thumbnails, store the thumbnails in Amazon S3, and send an email confirmation through Amazon Simple Email Service (Amazon SES).
C. Use Amazon S3 to store images and send notifications to Amazon Simple Notification Service (Amazon SNS). Configure Amazon SNS to invoke an AWS Lambda function to process the images into thumbnails, store the thumbnails in Amazon S3, and send an email confirmation through Amazon Simple Email Service (Amazon SES).
D. Use Amazon S3 to store images and send notifications to Amazon Simple Queue Service (Amazon SQS). Configure an AWS Lambda function to retrieve the messages from the SQS queue, process the images into thumbnails, store the thumbnails in Amazon S3, and send an email confirmation through Amazon Simple Email Service (Amazon SES).
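The S3-to-Lambda flow in option A needs the least configuration: an S3 event notification invokes the function directly, with no queue or topic to manage. The sketch below shows the shape of such a handler parsing an S3 ObjectCreated event; the bucket name, key layout, and thumbnail naming are assumptions, and the actual resizing and SES send are only indicated in comments.

```python
import os

def thumbnail_key(source_key: str) -> str:
    """Derive the destination key for a thumbnail (naming scheme is an
    assumption for illustration)."""
    name, ext = os.path.splitext(source_key)
    return f"thumbnails/{os.path.basename(name)}-thumb{ext}"

def handler(event, context=None):
    """Sketch of a Lambda entry point for an S3 ObjectCreated notification.
    Real image resizing (e.g. with Pillow) and the SES email are omitted."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        results.append((bucket, thumbnail_key(key)))
        # resize image, s3.put_object(...), ses.send_email(...) would go here
    return results

sample = {"Records": [{"s3": {"bucket": {"name": "photos"},
                              "object": {"key": "uploads/cat.jpg"}}}]}
print(handler(sample))  # [('photos', 'thumbnails/cat-thumb.jpg')]
```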
A company needs to build a reporting solution on AWS. The solution must support SQL queries that data analysts run on the data. The data analysts will run fewer than 10 total queries each day. The company generates 3 GB of new data daily in an on-premises relational database. This data needs to be transferred to AWS to perform reporting tasks.
What should a solutions architect recommend to meet these requirements at the LOWEST cost?
A. Use AWS Database Migration Service (AWS DMS) to replicate the data from the on-premises database into Amazon S3. Use Amazon Athena to query the data.
B. Use an Amazon Kinesis Data Firehose delivery stream to deliver the data into an Amazon Elasticsearch Service (Amazon ES) cluster. Run the queries in Amazon ES.
C. Export a daily copy of the data from the on-premises database. Use an AWS Storage Gateway file gateway to store and copy the export into Amazon S3. Use an Amazon EMR cluster to query the data.
D. Use AWS Database Migration Service (AWS DMS) to replicate the data from the on-premises database and load it into an Amazon Redshift cluster. Use the Amazon Redshift cluster to query the data.
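Option A is the lowest-cost fit because Athena bills per query on data already in S3, so roughly 10 small daily queries cost far less than an always-on Redshift, EMR, or Elasticsearch cluster. As a hedged sketch, the dictionary below holds the parameters for the single boto3 call that runs such a query; the database, table, and output bucket are placeholders, and no live call is made.

```python
# Parameters for athena.start_query_execution(**start_query). Athena is
# serverless and billed per data scanned, which suits a workload of fewer
# than 10 queries per day. All names below are placeholders.
query = """
SELECT region, SUM(revenue) AS total
FROM reporting_db.daily_orders
WHERE order_date = DATE '2023-01-15'
GROUP BY region
"""

start_query = {
    "QueryString": query,
    "QueryExecutionContext": {"Database": "reporting_db"},
    "ResultConfiguration": {"OutputLocation": "s3://reports-bucket/athena/"},
}

print(start_query["QueryExecutionContext"]["Database"])
```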
A company is upgrading its critical web-based application. The application is hosted on Amazon EC2 instances that are part of an Auto Scaling group behind an Application Load Balancer (ALB). The company wants to test the new configurations with a specific amount of traffic before the company begins to route all traffic to the upgraded application.
How should a solutions architect design the architecture to meet these requirements?
A. Create a new launch template. Associate the new launch template with the Auto Scaling group. Attach the Auto Scaling group to the ALB. Distribute traffic by using redirect rules.
B. Create a new launch template. Create an additional Auto Scaling group. Associate the new launch template with the additional Auto Scaling group. Attach the additional Auto Scaling group to the ALB. Distribute traffic by using weighted target groups.
C. Create a new launch template. Create an additional Auto Scaling group. Associate the new launch template with the additional Auto Scaling group. Create an additional ALB. Attach the additional Auto Scaling group to the additional ALB. Use an Amazon Route 53 failover routing policy to route traffic.
D. Create a new launch template. Create an additional Auto Scaling group. Associate the new launch template with the additional Auto Scaling group. Create an additional ALB. Attach the additional Auto Scaling group to the additional ALB. Use an Amazon Route 53 weighted routing policy to route traffic.
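Weighted target groups (option B) let a single ALB listener's forward action split traffic between the current and upgraded fleets by weight. The simulation below is only an illustration of that split; the target group names and the 90/10 weights are assumptions.

```python
import random

# An ALB listener "forward" action can reference two target groups with
# weights, e.g. 90% to the current fleet and 10% to the upgraded fleet.
# This simulates that routing decision to show the expected proportion.
weights = {"current-tg": 90, "upgraded-tg": 10}

def pick_target_group(rng: random.Random) -> str:
    groups, w = zip(*weights.items())
    return rng.choices(groups, weights=w, k=1)[0]

rng = random.Random(42)  # fixed seed so the simulation is repeatable
hits = sum(pick_target_group(rng) == "upgraded-tg" for _ in range(10_000))
print(hits / 10_000)  # roughly 0.10
```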
A company has deployed a database in Amazon RDS for MySQL. Due to increased transactions, the database support team is reporting slow reads against the DB instance and recommends adding a read replica.
Which combination of actions should a solutions architect take before implementing this change? (Select TWO.)
A. Enable binlog replication on the RDS primary node.
B. Choose a failover priority for the source DB instance.
C. Allow long-running transactions to complete on the source DB instance.
D. Create a global table and specify the AWS Regions where the table will be available.
E. Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.
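Creating an RDS for MySQL read replica requires automated backups on the source instance (a nonzero backup retention period), because that is what enables the binary logging the replica streams from; letting long-running transactions finish first avoids a lengthy replica lag on creation. The dictionaries below sketch the two boto3 calls involved; the instance identifiers are placeholders and no live call is made.

```python
# Parameters for rds.modify_db_instance(**modify_source): enable automated
# backups on the source, which turns on binary logging for replication.
modify_source = {
    "DBInstanceIdentifier": "shop-db",  # placeholder identifier
    "BackupRetentionPeriod": 7,         # any value from 1-35 days enables backups
    "ApplyImmediately": True,
}

# Parameters for rds.create_db_instance_read_replica(**create_replica),
# run after backups are enabled and long transactions have completed.
create_replica = {
    "DBInstanceIdentifier": "shop-db-replica",
    "SourceDBInstanceIdentifier": "shop-db",
}

print(modify_source["BackupRetentionPeriod"] > 0)  # True
```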
A company has a three-tier image-sharing application. The application uses an Amazon EC2 instance for the front-end layer, another EC2 instance for the application layer, and a third EC2 instance for a MySQL database. A solutions architect must design a scalable and highly available solution that requires the least amount of change to the application.
Which solution meets these requirements?
A. Use Amazon S3 to host the front-end layer. Use AWS Lambda functions for the application layer. Move the database to an Amazon DynamoDB table. Use Amazon S3 to store and serve users' images.
B. Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS DB instance with multiple read replicas to serve users' images.
C. Use Amazon S3 to host the front-end layer. Use a fleet of EC2 instances in an Auto Scaling group for the application layer. Move the database to a memory optimized instance type to store and serve users' images.
D. Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS Multi-AZ DB instance. Use Amazon S3 to store and serve users' images.
A company is building a shopping application on AWS. The application offers a catalog that changes once each month and needs to scale with traffic volume. The company wants the lowest possible latency from the application. Data from each user's shopping cart needs to be highly available. User session data must be available even if the user is disconnected and reconnects.
What should a solutions architect do to ensure that the shopping cart data is preserved at all times?
A. Configure an Application Load Balancer to enable the sticky sessions feature (session affinity) for access to the catalog in Amazon Aurora.
B. Configure Amazon ElastiCache for Redis to cache catalog data from Amazon DynamoDB and shopping cart data from the user's session.
C. Configure Amazon Elasticsearch Service (Amazon ES) to cache catalog data from Amazon DynamoDB and shopping cart data from the user's session.
D. Configure an Amazon EC2 instance with Amazon Elastic Block Store (Amazon EBS) storage for the catalog and shopping cart. Configure automated snapshots.
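Keeping cart and session state server-side in ElastiCache for Redis (option B) is what makes the data survive disconnects: the state lives in Redis, not in the user's browser or on any one web server. The sketch below shows the usual per-user-hash pattern with a plain dict standing in for Redis; with redis-py the same logic maps to HINCRBY/HGETALL against a key such as "cart:{user_id}" (the key scheme is an assumption for illustration).

```python
# Stand-in for ElastiCache for Redis: each user's cart is a hash keyed by
# "cart:{user_id}". A real implementation would use redis-py commands where
# the comments indicate.
class CartStore:
    def __init__(self):
        self._store = {}  # plays the role of the Redis keyspace

    def add_item(self, user_id: str, sku: str, qty: int) -> None:
        cart = self._store.setdefault(f"cart:{user_id}", {})
        cart[sku] = cart.get(sku, 0) + qty  # HINCRBY cart:{user_id} sku qty

    def get_cart(self, user_id: str) -> dict:
        # HGETALL cart:{user_id} -- state survives a client disconnect
        # because it is stored server-side, not in the session cookie.
        return dict(self._store.get(f"cart:{user_id}", {}))

store = CartStore()
store.add_item("u1", "sku-42", 2)
store.add_item("u1", "sku-42", 1)
print(store.get_cart("u1"))  # {'sku-42': 3}
```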