Exam Details

  • Exam Code: SAA-C02
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1080 Q&As
  • Last Updated: Jun 04, 2025

Amazon Amazon Certifications SAA-C02 Questions & Answers

  • Question 721:

    A company uses an application to present metrics from sporting events to the public. The application must scale quickly during live events and must store these metrics for long-term reporting purposes. The company's architecture includes the following:

    1. Amazon EC2 instances that run in an Auto Scaling group in private subnets

    2. A Network Load Balancer that runs in public subnets

    3. A MongoDB database cluster that runs across multiple EC2 instances

    A solutions architect must implement a solution that minimizes operational overhead. The solution also must be able to scale automatically. What should the solutions architect set up to meet these requirements?

    A. An Amazon DynamoDB database

    B. An Amazon RDS for MySQL DB instance

    C. EC2 instances that run MySQL

    D. Amazon Redshift
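The fully managed, auto-scaling option in A can be illustrated with a short sketch. This builds the request parameters for a DynamoDB table in on-demand mode, which removes capacity management entirely; the table name and key attributes are hypothetical, not part of the question.

```python
# Hypothetical sketch: parameters for a DynamoDB table in on-demand
# (PAY_PER_REQUEST) mode, which scales automatically with traffic and
# needs no provisioned-capacity management. Names are illustrative.
create_table_params = {
    "TableName": "SportingEventMetrics",
    "AttributeDefinitions": [
        {"AttributeName": "EventId", "AttributeType": "S"},
        {"AttributeName": "Timestamp", "AttributeType": "N"},
    ],
    "KeySchema": [
        {"AttributeName": "EventId", "KeyType": "HASH"},     # partition key
        {"AttributeName": "Timestamp", "KeyType": "RANGE"},  # sort key
    ],
    "BillingMode": "PAY_PER_REQUEST",  # no read/write capacity to tune
}

# With boto3 this would be passed as:
#   boto3.client("dynamodb").create_table(**create_table_params)
```

On-demand billing is what makes this a minimal-overhead fit here: there is no Auto Scaling policy or capacity plan to maintain, unlike the self-managed MongoDB cluster.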

  • Question 722:

    A disaster relief company is designing a new solution to analyze real-time CSV data. The data is collected by a network of thousands of research stations that are distributed across the world. The data volume is consistent and constant, and the size of each data file is 512 KB. The company needs to stream the data and analyze the data in real time.

    Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

    A. Provision an appropriately sized Amazon Simple Queue Service (Amazon SQS) queue. Use the AWS SDK at the research stations to write the data into the SQS queue

    B. Provision an appropriately sized Amazon Kinesis Data Firehose delivery stream. Use the AWS SDK at the research stations to write the data into the delivery stream and then into an Amazon S3 bucket.

    C. Provision an appropriately sized Amazon Kinesis Data Analytics application. Use the AWS CLI to configure Kinesis Data Analytics with SQL queries.

    D. Provision an AWS Lambda function to process the data. Set up the BatchSize property on the Lambda event source.

    E. Provision an AWS Lambda function to process the data. Set up an Amazon EventBridge (Amazon CloudWatch Events) cron expression rule to invoke the Lambda function
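Option B's delivery-stream setup can be sketched as the request parameters for creating a Kinesis Data Firehose stream that stations write to directly and that buffers into S3. The stream name, role ARN, bucket ARN, and buffering values are illustrative assumptions, not values from the question.

```python
# Hypothetical sketch: parameters for a Kinesis Data Firehose delivery
# stream that receives records written via the AWS SDK and delivers
# them to Amazon S3 in buffered batches. All names are illustrative.
delivery_stream_params = {
    "DeliveryStreamName": "research-station-stream",
    "DeliveryStreamType": "DirectPut",  # stations write with the AWS SDK
    "ExtendedS3DestinationConfiguration": {
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "BucketARN": "arn:aws:s3:::research-station-data",
        "BufferingHints": {
            "SizeInMBs": 5,           # flush after 5 MB of data ...
            "IntervalInSeconds": 60,  # ... or 60 seconds, whichever first
        },
    },
}

# With boto3:
#   boto3.client("firehose").create_delivery_stream(**delivery_stream_params)
```

Because Firehose is fully managed and sized by throughput rather than by servers, it pairs naturally with a Kinesis Data Analytics application for the real-time SQL analysis in option C.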

  • Question 723:

    A company runs a website on Amazon EC2 instances behind an ELB Application Load Balancer. Amazon Route 53 is used for the DNS. The company wants to set up a backup website with a message including a phone number and email address that users can reach if the primary website is down. How should the company deploy this solution?

    A. Use Amazon S3 website hosting for the backup website and a Route 53 failover routing policy

    B. Use Amazon S3 website hosting for the backup website and a Route 53 latency routing policy

    C. Deploy the application in another AWS Region and use ELB health checks for failover routing.

    D. Deploy the application in another AWS Region and use server-side redirection on the primary website
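Option A's failover routing can be sketched as the two Route 53 record sets it requires: a PRIMARY alias pointing at the load balancer (with a health check) and a SECONDARY alias pointing at the S3 static-website backup. The domain, zone IDs, DNS names, and health-check ID below are illustrative placeholders.

```python
# Hypothetical sketch: Route 53 failover routing uses two record sets
# with the same name. Traffic goes to PRIMARY while its health check
# passes, and to SECONDARY (the S3 backup page) when it fails.
primary_record = {
    "Name": "www.example.com.",
    "Type": "A",
    "SetIdentifier": "primary",
    "Failover": "PRIMARY",
    "HealthCheckId": "11111111-2222-3333-4444-555555555555",  # illustrative
    "AliasTarget": {
        "HostedZoneId": "Z35SXDOTRQ7X7K",  # ALB hosted zone (illustrative)
        "DNSName": "my-alb-123.us-east-1.elb.amazonaws.com.",
        "EvaluateTargetHealth": True,
    },
}

secondary_record = {
    "Name": "www.example.com.",
    "Type": "A",
    "SetIdentifier": "secondary",
    "Failover": "SECONDARY",  # served only when the primary is unhealthy
    "AliasTarget": {
        "HostedZoneId": "Z3AQBSTGFYJSTF",  # S3 website endpoint zone (illustrative)
        "DNSName": "s3-website-us-east-1.amazonaws.com.",
        "EvaluateTargetHealth": False,
    },
}
# Both records would be submitted in a ChangeResourceRecordSets call.
```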

  • Question 724:

    A company wants to move a multi-tiered application from on premises to the AWS Cloud to improve the application's performance. The application consists of application tiers that communicate with each other by way of

    Which solution meets these requirements and is the MOST operationally efficient?

    A. Use Amazon API Gateway and direct transactions to the AWS Lambda functions as the application layer. Use Amazon Simple Queue Service (Amazon SQS) as the communication layer between application services.

    B. Use Amazon CloudWatch metrics to analyze the application performance history to determine the servers' peak utilization during the performance failures. Increase the size of the application servers' Amazon EC2 instances to meet the peak requirements.

    C. Use Amazon Simple Notification Service (Amazon SNS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SNS queue length and scale up and down as required.

    D. Use Amazon Simple Queue Service (Amazon SQS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SQS queue length and scale up when communication failures are detected.

  • Question 725:

    A developer is creating an AWS Lambda function to perform dynamic updates to a database when an item is added to an Amazon Simple Queue Service (Amazon SQS) queue. A solutions architect must recommend a solution that tracks any usage of database credentials in AWS CloudTrail. The solution also must provide auditing capabilities.

    Which solution will meet these requirements?

    A. Store the encrypted credentials in a Lambda environment variable

    B. Create an Amazon DynamoDB table to store the credentials. Encrypt the table.

    C. Store the credentials as a secure string in AWS Systems Manager Parameter Store

    D. Use an AWS Key Management Service (AWS KMS) key store to store the credentials
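Option C can be sketched as the parameters for storing credentials as a SecureString in Systems Manager Parameter Store; every subsequent retrieval is an API call that CloudTrail records, which is what gives the audit trail. The parameter name and value below are illustrative only.

```python
# Hypothetical sketch: store database credentials as a SecureString
# parameter in AWS Systems Manager Parameter Store. Each GetParameter
# call made by the Lambda function is logged by AWS CloudTrail,
# providing the required usage tracking and auditing.
put_parameter_params = {
    "Name": "/app/db/credentials",  # illustrative parameter name
    "Value": '{"username": "app_user", "password": "example-only"}',
    "Type": "SecureString",  # encrypted at rest with an AWS KMS key
    "Overwrite": True,
}

# With boto3:
#   boto3.client("ssm").put_parameter(**put_parameter_params)
# The Lambda function would later retrieve it with:
#   boto3.client("ssm").get_parameter(
#       Name="/app/db/credentials", WithDecryption=True)
```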

  • Question 726:

    A development team is collaborating with another company to create an integrated product. The other company needs to access an Amazon Simple Queue Service (Amazon SQS) queue that is contained in the development team's account. The other company wants to poll the queue without giving up its own account permissions to do so.

    How should a solutions architect provide access to the SQS queue?

    A. Create an instance profile that provides the other company access to the SQS queue.

    B. Create an IAM policy that provides the other company access to the SQS queue.

    C. Create an SQS access policy that provides the other company access to the SQS queue

    D. Create an Amazon Simple Notification Service (Amazon SNS) access policy that provides the other company access to the SQS queue.
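The resource-based policy in option C can be sketched as a queue access policy granting the other account permission to poll. The account IDs, queue ARN, and statement ID are illustrative placeholders.

```python
import json

# Hypothetical sketch: an SQS queue (resource-based) access policy that
# lets a second AWS account poll the queue without any changes to that
# account's own IAM setup. All IDs and ARNs are illustrative.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPartnerAccountToPoll",
            "Effect": "Allow",
            # The other company's AWS account (illustrative account ID)
            "Principal": {"AWS": "arn:aws:iam::999999999999:root"},
            "Action": [
                "sqs:ReceiveMessage",
                "sqs:DeleteMessage",
                "sqs:GetQueueAttributes",
            ],
            "Resource": "arn:aws:sqs:us-east-1:111111111111:shared-queue",
        }
    ],
}

# Attached to the queue via:
#   sqs.set_queue_attributes(QueueUrl=queue_url,
#       Attributes={"Policy": json.dumps(queue_policy)})
policy_json = json.dumps(queue_policy)
```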

  • Question 727:

    A company has designed an application where users provide small sets of textual data by calling a public API. The application runs on AWS and includes a public Amazon API Gateway API that forwards requests to an AWS Lambda function for processing. The Lambda function then writes the data to an Amazon Aurora Serverless database for consumption.

    The company is concerned that it could lose some user data if a Lambda function fails to process the request properly or reaches a concurrency limit.

    What should a solutions architect recommend to resolve this concern?

    A. Split the existing Lambda function into two Lambda functions. Configure one function to receive API Gateway requests and put relevant items into Amazon Simple Queue Service (Amazon SQS). Configure the other function to read items from Amazon SQS and save the data into Aurora.

    B. Configure the Lambda function to receive API Gateway requests and write relevant items to Amazon ElastiCache. Configure ElastiCache to save the data into Aurora.

    C. Increase the memory for the Lambda function. Configure Aurora to use the Multi-AZ feature.

    D. Split the existing Lambda function into two Lambda functions. Configure one function to receive API Gateway requests and put relevant items into Amazon Simple Notification Service (Amazon SNS). Configure the other function to read items from Amazon SNS and save the data into Aurora.
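The front half of option A can be sketched as a function that maps an API Gateway proxy event to an SQS SendMessage request instead of writing to Aurora directly, so a request survives a downstream failure or a concurrency throttle. The queue URL, field names, and helper function are hypothetical.

```python
import json

# Hypothetical sketch of option A's API-facing Lambda: buffer each
# incoming request in SQS so a second, queue-triggered Lambda can
# write it to Aurora with safe retries. Names are illustrative.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111111111111/user-data"


def build_sqs_message(api_gateway_event: dict) -> dict:
    """Map an API Gateway proxy event to SQS SendMessage parameters."""
    body = json.loads(api_gateway_event["body"])
    return {
        "QueueUrl": QUEUE_URL,
        "MessageBody": json.dumps({"text": body["text"]}),
    }


# Inside the Lambda handler this would be sent with:
#   boto3.client("sqs").send_message(**build_sqs_message(event))
event = {"body": json.dumps({"text": "hello"})}
message = build_sqs_message(event)
```

SQS fits here rather than SNS (option D) because a queue retains each message until a consumer successfully deletes it, whereas a plain SNS topic does not hold messages for a consumer that is throttled or failing.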

  • Question 728:

    A company is using a fleet of Amazon EC2 instances to ingest data from on-premises data sources. The data is in JSON format, and ingestion rates can be as high as 1 MB/s. When an EC2 instance is rebooted, the data in flight is lost. The company's data science team wants to query ingested data in near-real time.

    Which solution provides near-real-time data querying that is scalable with minimal data loss?

    A. Publish data to Amazon Kinesis Data Streams. Use Kinesis Data Analytics to query the data.

    B. Publish data to Amazon Kinesis Data Firehose with Amazon Redshift as the destination. Use Amazon Redshift to query the data.

    C. Store ingested data in an EC2 instance store. Publish data to Amazon Kinesis Data Firehose with Amazon S3 as the destination. Use Amazon Athena to query the data.

    D. Store ingested data in an Amazon Elastic Block Store (Amazon EBS) volume. Publish data to Amazon ElastiCache for Redis. Subscribe to the Redis channel to query the data.

  • Question 729:

    A company hosts a training site on a fleet of Amazon EC2 instances. The company anticipates that its new course, which consists of dozens of training videos on the site, will be extremely popular when it is released in 1 week. What should a solutions architect do to minimize the anticipated server load?

    A. Store the videos in Amazon ElastiCache for Redis. Update the web servers to serve the videos using the ElastiCache API.

    B. Store the videos in Amazon Elastic File System (Amazon EFS). Create a user data script for the web servers to mount the EFS volume.

    C. Store the videos in an Amazon S3 bucket. Create an Amazon CloudFront distribution with an origin access identity (OAI) for that S3 bucket. Restrict Amazon S3 access to the OAI.

    D. Store the videos in an Amazon S3 bucket. Create an AWS Storage Gateway file gateway to access the S3 bucket. Create a user data script for the web servers to mount the file gateway.
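The "restrict Amazon S3 access to the OAI" step in option C can be sketched as the S3 bucket policy it implies, so the videos are reachable only through CloudFront and never directly from the bucket. The OAI ID and bucket name are illustrative placeholders.

```python
# Hypothetical sketch: an S3 bucket policy that grants read access only
# to the CloudFront origin access identity (OAI), forcing all video
# traffic through the CDN edge caches instead of the origin servers.
# The OAI ID and bucket name are illustrative.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontOAIReadOnly",
            "Effect": "Allow",
            "Principal": {
                "AWS": ("arn:aws:iam::cloudfront:user/"
                        "CloudFront Origin Access Identity E1EXAMPLE")
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::training-videos-bucket/*",
        }
    ],
}
# Attached with: s3.put_bucket_policy(Bucket="training-videos-bucket",
#                                     Policy=json.dumps(bucket_policy))
```

Because CloudFront caches the videos at edge locations, the EC2 fleet serves almost none of the video traffic, which is exactly the server-load reduction the question asks for.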

  • Question 730:

    A company stores call recordings on a monthly basis. Users access the recorded files randomly within 1 year of recording, but users rarely access the files after 1 year. The company wants to optimize its solution by allowing only files that are newer than 1 year old to be queried and retrieved as quickly as possible. A delay in retrieving older files is acceptable.

    Which solution meets these requirements MOST cost-effectively?

    A. Store individual files in Amazon S3 Glacier. Store search metadata in object tags that are created in S3 Glacier. Query the S3 Glacier tags to retrieve the files from S3 Glacier.

    B. Store individual files in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.

    C. Store individual files in Amazon S3. Store search metadata for each archive in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.

    D. Store individual files in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Store search metadata in Amazon RDS. Query the metadata from Amazon RDS. Retrieve the files from Amazon S3 or S3 Glacier.
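The "move to S3 Glacier after 1 year" step that options B, C, and D share can be sketched as the lifecycle configuration it implies. The bucket prefix and rule ID are illustrative assumptions.

```python
# Hypothetical sketch: an S3 Lifecycle configuration that transitions
# objects to the GLACIER storage class 365 days after creation, so
# recent recordings stay in S3 for fast retrieval while older ones
# move to cheaper archival storage. Prefix and rule ID are illustrative.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-after-one-year",
            "Status": "Enabled",
            "Filter": {"Prefix": "recordings/"},
            "Transitions": [
                {"Days": 365, "StorageClass": "GLACIER"}
            ],
        }
    ],
}

# Applied with:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="call-recordings",
#       LifecycleConfiguration=lifecycle_configuration)
```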

Tips on How to Prepare for the Exams

Nowadays, certification exams have become increasingly important and are required by more and more enterprises when applying for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are confused about your SAA-C02 exam preparation and Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.