Exam Details

  • Exam Code: SAA-C02
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1080 Q&As
  • Last Updated: Jun 04, 2025

Amazon Certifications SAA-C02 Questions & Answers

  • Question 871:

    A rapidly growing ecommerce company is running its workloads in a single AWS Region. A solutions architect must create a disaster recovery (DR) strategy that includes a different AWS Region. The company wants its database to be up to date in the DR Region with the least possible latency. The remaining infrastructure in the DR Region needs to run at reduced capacity and must be able to scale up if necessary.

    Which solution will meet these requirements with the LOWEST recovery time objective (RTO)?

    A. Use an Amazon Aurora global database with a pilot light deployment.

    B. Use an Amazon Aurora global database with a warm standby deployment.

    C. Use an Amazon RDS Multi-AZ DB instance with a pilot light deployment.

    D. Use an Amazon RDS Multi-AZ DB instance with a warm standby deployment.
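
    Note: Options A and B both build on an Amazon Aurora global database, which replicates data to a secondary Region with typically sub-second lag. Without indicating which answer choice is correct, a minimal boto3 sketch of turning an existing Aurora cluster into a global database and adding a cluster in the DR Region might look like the following; the identifiers, account number, and Regions are placeholders.

        import boto3

        # Primary Region: promote the existing Aurora cluster into a global database.
        rds_primary = boto3.client("rds", region_name="us-east-1")
        rds_primary.create_global_cluster(
            GlobalClusterIdentifier="ecommerce-global",  # hypothetical name
            SourceDBClusterIdentifier="arn:aws:rds:us-east-1:111122223333:cluster:ecommerce-db",
        )

        # DR Region: attach a secondary cluster that receives the replicated data.
        rds_dr = boto3.client("rds", region_name="us-west-2")
        rds_dr.create_db_cluster(
            DBClusterIdentifier="ecommerce-db-dr",  # hypothetical name
            Engine="aurora-mysql",
            GlobalClusterIdentifier="ecommerce-global",
        )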

  • Question 872:

    A company needs to develop a repeatable solution to process time-ordered information from websites around the world. The company collects the data from the websites by using Amazon Kinesis Data Streams and stores the data in Amazon S3.

    The processing logic needs to collect events and handle data from the last 5 years.

    The processing logic also must generate results in an S3 bucket so that a business intelligence application can analyze and compare the results. The processing must be repeated multiple times.

    What should a solutions architect do to meet these requirements?

    A. Use Amazon S3 to collect events. Create an AWS Lambda function to process the events. Create different Lambda functions to handle repeated processing.

    B. Use Amazon EventBridge (Amazon CloudWatch Events) to collect events. Set AWS Lambda as an event target. Use EventBridge (CloudWatch Events) to create an archive for the events and to replay the events.

    C. Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to collect events. Process the events by using Amazon EC2. Use AWS Step Functions to create an archive for the events and to replay the events.

    D. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to collect events. Process the events by using Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MSK to create an archive for the events and to replay the events.
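
    Note: The archive-and-replay capability mentioned in option B is a native Amazon EventBridge feature. As a rough boto3 sketch only (bus ARN, archive name, and time window are placeholders), archiving the events and later replaying a historical window could look like this:

        import boto3
        from datetime import datetime, timezone

        events = boto3.client("events")
        BUS_ARN = "arn:aws:events:us-east-1:111122223333:event-bus/default"  # placeholder

        # Keep an archive of every event that reaches the bus.
        events.create_archive(
            ArchiveName="website-events-archive",  # hypothetical name
            EventSourceArn=BUS_ARN,
            RetentionDays=0,  # 0 = retain indefinitely
        )

        # Later, replay a historical window back onto the bus for reprocessing.
        events.start_replay(
            ReplayName="reprocess-2020",  # hypothetical name
            EventSourceArn="arn:aws:events:us-east-1:111122223333:archive/website-events-archive",
            EventStartTime=datetime(2020, 1, 1, tzinfo=timezone.utc),
            EventEndTime=datetime(2020, 12, 31, tzinfo=timezone.utc),
            Destination={"Arn": BUS_ARN},
        )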

  • Question 873:

    An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.

    Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the Lambda function more than once, resulting in multiple email messages.

    What should the solutions architect do to resolve this issue with the LEAST operational overhead?

    A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.

    B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.

    C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.

    D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.
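
    Note: Because SQS standard queues provide at-least-once delivery, the visibility timeout in option C controls how long an in-flight message stays hidden from other consumers. A minimal boto3 sketch, with a placeholder queue URL and timeout value:

        import boto3

        sqs = boto3.client("sqs")

        # Keep each message hidden for longer than the Lambda function timeout
        # plus the batch window, so it is not redelivered mid-processing.
        sqs.set_queue_attributes(
            QueueUrl="https://sqs.us-east-1.amazonaws.com/111122223333/image-events",  # placeholder
            Attributes={"VisibilityTimeout": "360"},  # seconds, passed as a string
        )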

  • Question 874:

    A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.

    The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.

    B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.

    C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.

    D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.
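
    Note: Options B, C, and D all begin by ordering a Snow Family device through the AWS Snowball job API. A hedged boto3 sketch follows; the bucket, shipping address ID, and role are placeholders, several account-specific options are omitted, and no answer choice is implied.

        import boto3

        snowball = boto3.client("snowball")

        snowball.create_job(
            JobType="IMPORT",  # move on-premises data into Amazon S3
            Resources={
                "S3Resources": [
                    {"BucketArn": "arn:aws:s3:::reporting-data-lake"}  # placeholder bucket
                ]
            },
            AddressId="ADID1234-example",  # placeholder shipping address ID
            RoleARN="arn:aws:iam::111122223333:role/SnowballImportRole",  # placeholder role
            SnowballType="EDGE_S",  # Snowball Edge Storage Optimized
            Description="50 TB reporting data transfer",
        )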

  • Question 875:

    A company hosts a multiplayer gaming application on AWS. The company wants the application to read data with sub-millisecond latency and run one-time queries on historical data. Which solution will meet these requirements with the LEAST operational overhead?

    A. Use Amazon RDS for data that is frequently accessed. Run a periodic custom script to export the data to an Amazon S3 bucket.

    B. Store the data directly in an Amazon S3 bucket. Implement an S3 Lifecycle policy to move older data to S3 Glacier Deep Archive for long-term storage. Run one-time queries on the data in Amazon S3 by using Amazon Athena.

    C. Use Amazon DynamoDB with DynamoDB Accelerator (DAX) for data that is frequently accessed. Export the data to an Amazon S3 bucket by using DynamoDB table export. Run one-time queries on the data in Amazon S3 by using Amazon Athena.

    D. Use Amazon DynamoDB for data that is frequently accessed. Turn on streaming to Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read the data from Kinesis Data Streams. Store the records in an Amazon S3 bucket.
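
    Note: Option C pairs DAX for sub-millisecond reads with the built-in DynamoDB table export to Amazon S3, which Amazon Athena can then query. A rough boto3 sketch of the export-and-query steps (table ARN, bucket, database, and query are placeholders, and point-in-time recovery must already be enabled on the table):

        import boto3

        dynamodb = boto3.client("dynamodb")
        athena = boto3.client("athena")

        # Export a point-in-time snapshot of the table to S3.
        dynamodb.export_table_to_point_in_time(
            TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/GameState",  # placeholder
            S3Bucket="game-history-exports",  # placeholder
            ExportFormat="DYNAMODB_JSON",
        )

        # Run a one-time query against the exported data.
        athena.start_query_execution(
            QueryString="SELECT player_id, score FROM game_history WHERE year = 2021",
            QueryExecutionContext={"Database": "game_analytics"},  # placeholder
            ResultConfiguration={"OutputLocation": "s3://game-history-exports/athena-results/"},
        )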

  • Question 876:

    A company uses a legacy application to produce data in CSV format. The legacy application stores the output data in Amazon S3. The company is deploying a new commercial off-the-shelf (COTS) application that can perform complex SQL queries to analyze data that is stored in Amazon Redshift and Amazon S3 only. However, the COTS application cannot process the .csv files that the legacy application produces. The company cannot update the legacy application to produce data in another format. The company needs to implement a solution so that the COTS application can use the data that the legacy application produces.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Create an AWS Glue extract, transform, and load (ETL) job that runs on a schedule. Configure the ETL job to process the .csv files and store the processed data in Amazon Redshift.

    B. Develop a Python script that runs on Amazon EC2 instances to convert the .csv files to .sql files. Invoke the Python script on a cron schedule to store the output files in Amazon S3.

    C. Create an AWS Lambda function and an Amazon DynamoDB table. Use an S3 event to invoke the Lambda function. Configure the Lambda function to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in the DynamoDB table.

    D. Use Amazon EventBridge (Amazon CloudWatch Events) to launch an Amazon EMR cluster on a weekly schedule. Configure the EMR cluster to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in an Amazon Redshift table.
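
    Note: The scheduled AWS Glue ETL job in option A can be registered entirely through API calls; the job script itself would read the .csv objects and load the results into Amazon Redshift. A hedged boto3 sketch (job name, role, script location, and schedule are placeholders):

        import boto3

        glue = boto3.client("glue")

        # Register the ETL job that processes the .csv files.
        glue.create_job(
            Name="csv-to-redshift",  # hypothetical job name
            Role="arn:aws:iam::111122223333:role/GlueETLRole",  # placeholder role
            Command={
                "Name": "glueetl",
                "ScriptLocation": "s3://etl-scripts/csv_to_redshift.py",  # placeholder script
            },
            GlueVersion="3.0",
        )

        # Run the job on a schedule so new .csv output is picked up automatically.
        glue.create_trigger(
            Name="csv-to-redshift-nightly",
            Type="SCHEDULED",
            Schedule="cron(0 2 * * ? *)",  # 02:00 UTC daily
            Actions=[{"JobName": "csv-to-redshift"}],
            StartOnCreation=True,
        )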

  • Question 877:

    A company has a web application that is based on Java and PHP. The company plans to move the application from on premises to AWS. The company needs the ability to test new site features frequently. The company also needs a highly available and managed solution that requires minimum operational overhead.

    Which solution will meet these requirements?

    A. Create an Amazon S3 bucket. Enable static web hosting on the S3 bucket. Upload the static content to the S3 bucket. Use AWS Lambda to process all dynamic content.

    B. Deploy the web application to an AWS Elastic Beanstalk environment. Use URL swapping to switch between multiple Elastic Beanstalk environments for feature testing.

    C. Deploy the web application to Amazon EC2 instances that are configured with Java and PHP. Use Auto Scaling groups and an Application Load Balancer to manage the website's availability.

    D. Containerize the web application. Deploy the web application to Amazon EC2 instances. Use the AWS Load Balancer Controller to dynamically route traffic between containers that contain the new site features for testing.
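
    Note: The "URL swapping" in option B refers to Elastic Beanstalk's CNAME swap, which shifts the production URL between two environments so new site features can be tested and promoted quickly. A minimal boto3 sketch with hypothetical environment names:

        import boto3

        eb = boto3.client("elasticbeanstalk")

        # Swap the CNAMEs of the production and feature-test environments so that
        # traffic moves to the environment running the new site features.
        eb.swap_environment_cnames(
            SourceEnvironmentName="webapp-prod",  # placeholder
            DestinationEnvironmentName="webapp-test",  # placeholder
        )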

  • Question 878:

    A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure. The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.

    Which solution will meet these requirements?

    A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.

    B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.

    C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.

    D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.
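
    Note: Options C and D differ mainly in the storage layer. Amazon EFS (option C) presents a standard POSIX file system that every instance in the Multi-AZ Auto Scaling group can mount over NFS, and it grows and shrinks automatically with the data. A rough boto3 sketch of creating the file system and one mount target (subnet and security group IDs are placeholders):

        import boto3

        efs = boto3.client("efs")

        # Create a file system that scales automatically with the stored data.
        fs = efs.create_file_system(
            CreationToken="app-output-fs",  # placeholder idempotency token
            PerformanceMode="generalPurpose",
            Encrypted=True,
        )

        # One mount target per Availability Zone lets instances in the Auto Scaling
        # group mount the same file system.
        efs.create_mount_target(
            FileSystemId=fs["FileSystemId"],
            SubnetId="subnet-0123456789abcdef0",  # placeholder
            SecurityGroups=["sg-0123456789abcdef0"],  # placeholder
        )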

  • Question 879:

    A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand.

    Which migration solution will meet these requirements?

    A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.

    B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.

    C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.

    D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.
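
    Note: The "Aurora Auto Scaling" in option C adds and removes Aurora Replicas through Application Auto Scaling. A hedged boto3 sketch of registering the cluster and attaching a target-tracking policy (cluster identifier, capacity limits, and target value are placeholders):

        import boto3

        autoscaling = boto3.client("application-autoscaling")

        # Register the Aurora cluster's replica count as a scalable target.
        autoscaling.register_scalable_target(
            ServiceNamespace="rds",
            ResourceId="cluster:transactions-aurora",  # placeholder cluster ID
            ScalableDimension="rds:cluster:ReadReplicaCount",
            MinCapacity=1,
            MaxCapacity=8,
        )

        # Add replicas when average reader CPU exceeds the target during demand spikes.
        autoscaling.put_scaling_policy(
            PolicyName="aurora-cpu-target-tracking",
            ServiceNamespace="rds",
            ResourceId="cluster:transactions-aurora",
            ScalableDimension="rds:cluster:ReadReplicaCount",
            PolicyType="TargetTrackingScaling",
            TargetTrackingScalingPolicyConfiguration={
                "TargetValue": 70.0,
                "PredefinedMetricSpecification": {
                    "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
                },
            },
        )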

  • Question 880:

    A company has an Amazon S3 data lake that is governed by AWS Lake Formation. The company wants to create a visualization in Amazon QuickSight by joining the data in the data lake with operational data that is stored in an Amazon Aurora MySQL database. The company wants to enforce column-level authorization so that the company's marketing team can access only a subset of columns in the database.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use Amazon EMR to ingest the data directly from the database to the QuickSight SPICE engine. Include only the required columns.

    B. Use AWS Glue Studio to ingest the data from the database to the S3 data lake. Attach an IAM policy to the QuickSight users to enforce column-level access control. Use Amazon S3 as the data source in QuickSight.

    C. Use AWS Glue Elastic Views to create a materialized view for the database in Amazon S3. Create an S3 bucket policy to enforce column-level access control for the QuickSight users. Use Amazon S3 as the data source in QuickSight.

    D. Use a Lake Formation blueprint to ingest the data from the database to the S3 data lake. Use Lake Formation to enforce column-level access control for the QuickSight users. Use Amazon Athena as the data source in QuickSight.
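
    Note: Column-level authorization in option D is enforced by Lake Formation grants, which Athena (and therefore QuickSight) honors when reading the data lake. A minimal boto3 sketch of granting a marketing role SELECT on only a subset of columns (the role, database, table, and column names are placeholders):

        import boto3

        lakeformation = boto3.client("lakeformation")

        lakeformation.grant_permissions(
            Principal={
                "DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/MarketingAnalysts"  # placeholder
            },
            Resource={
                "TableWithColumns": {
                    "DatabaseName": "operational_data",  # placeholder
                    "Name": "customers",  # placeholder
                    "ColumnNames": ["customer_id", "region", "segment"],  # allowed columns only
                }
            },
            Permissions=["SELECT"],
        )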

Tips on How to Prepare for the Exams

Nowadays, certification exams are becoming more important and are required by more and more enterprises when you apply for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and how do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your SAA-C02 exam preparation or your Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.