Exam Details

  • Exam Code: SAA-C02
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1080 Q&As
  • Last Updated: Jun 04, 2025

Amazon Certifications SAA-C02 Questions & Answers

  • Question 691:

    A company is running several business applications in three separate VPCs within the us-east-1 Region. The applications must be able to communicate between VPCs. The applications also must be able to consistently send hundreds of gigabytes of data each day to a latency-sensitive application that runs in a single on-premises data center.

    A solutions architect needs to design a network connectivity solution that maximizes cost-effectiveness. (A minimal configuration sketch follows the answer choices.)

    Which solution meets these requirements?

    A. Configure three AWS Site-to-Site VPN connections from the data center to AWS. Establish connectivity by configuring one VPN connection for each VPC.

    B. Launch a third-party virtual network appliance in each VPC. Establish an IPsec VPN tunnel between the data center and each virtual appliance.

    C. Set up three AWS Direct Connect connections from the data center to a Direct Connect gateway in us-east-1. Establish connectivity by configuring each VPC to use one of the Direct Connect connections.

    D. Set up one AWS Direct Connect connection from the data center to AWS. Create a transit gateway, and attach each VPC to the transit gateway. Establish connectivity between the Direct Connect connection and the transit gateway.
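    To picture how the services named in option D fit together, here is a minimal boto3 sketch. The VPC, subnet, and Direct Connect gateway IDs and the allowed prefix are placeholders, and the Direct Connect gateway is assumed to already exist; this illustrates the API calls involved, not a definitive build-out.

    ```python
    # Sketch: one Direct Connect connection feeding a transit gateway that links three VPCs.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    dx = boto3.client("directconnect", region_name="us-east-1")

    # Create the transit gateway and attach each VPC to it.
    tgw = ec2.create_transit_gateway(Description="hub for us-east-1 VPCs")
    tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

    for vpc_id, subnet_id in [("vpc-aaa", "subnet-aaa"),
                              ("vpc-bbb", "subnet-bbb"),
                              ("vpc-ccc", "subnet-ccc")]:  # placeholder IDs
        ec2.create_transit_gateway_vpc_attachment(
            TransitGatewayId=tgw_id, VpcId=vpc_id, SubnetIds=[subnet_id]
        )

    # Associate an existing Direct Connect gateway with the transit gateway so the
    # on-premises data center can reach all three VPCs over one connection.
    dx.create_direct_connect_gateway_association(
        directConnectGatewayId="dx-gw-placeholder-id",          # placeholder
        gatewayId=tgw_id,
        addAllowedPrefixesToDirectConnectGateway=[{"cidr": "10.0.0.0/8"}],  # placeholder prefix
    )
    ```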

  • Question 692:

    A company has a production web application in which users upload documents through a web interface or a mobile app. According to a new regulatory requirement, new documents cannot be modified or deleted after they are stored. What should a solutions architect do to meet this requirement? (An S3 Object Lock sketch follows the answer choices.)

    A. Store the uploaded documents in an Amazon S3 bucket with S3 Versioning and S3 Object Lock enabled.

    B. Store the uploaded documents in an Amazon S3 bucket. Configure an S3 Lifecycle policy to archive the documents periodically.

    C. Store the uploaded documents in an Amazon S3 bucket with S3 Versioning enabled. Configure an ACL to restrict all access to read-only.

    D. Store the uploaded documents on an Amazon Elastic File System (Amazon EFS) volume. Access the data by mounting the volume in read-only mode.
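    As a reference for the mechanism named in option A, here is a minimal boto3 sketch of a bucket with Object Lock and a default retention rule. The bucket name and retention period are illustrative placeholders.

    ```python
    # Sketch: S3 Object Lock with Versioning for write-once storage.
    import boto3

    s3 = boto3.client("s3")

    # Object Lock must be enabled at bucket creation; this also enables Versioning.
    s3.create_bucket(Bucket="example-docs-bucket", ObjectLockEnabledForBucket=True)

    # A default retention rule in COMPLIANCE mode prevents overwriting or deleting
    # object versions until the retention period expires.
    s3.put_object_lock_configuration(
        Bucket="example-docs-bucket",
        ObjectLockConfiguration={
            "ObjectLockEnabled": "Enabled",
            "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}},  # placeholder period
        },
    )
    ```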

  • Question 693:

    A company is running a multi-tier web application on AWS. The application runs its database on Amazon Aurora MySQL. The application and database tiers are in the us-east-1 Region.

    A database administrator who monitors the Aurora DB cluster finds that an intermittent increase in read traffic is creating high CPU utilization on the read replica. The result is increased read latency for the application. The memory and disk utilization of the DB instance are stable throughout the event of increased latency. (A replica scaling sketch follows the answer choices.)

    What should a solutions architect do to improve the read scalability?

    A. Reboot the DB cluster

    B. Create a cross-Region read replica

    C. Configure Aurora Auto Scaling for the read replica

    D. Increase the provisioned read IOPS for the DB instance
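    For the mechanism named in option C, Aurora replica Auto Scaling is configured through the Application Auto Scaling API. A minimal sketch follows; the cluster name, capacity limits, and CPU target are placeholders.

    ```python
    # Sketch: scale Aurora read replicas on average reader CPU.
    import boto3

    aas = boto3.client("application-autoscaling", region_name="us-east-1")

    # Register the cluster's read replica count as a scalable target.
    aas.register_scalable_target(
        ServiceNamespace="rds",
        ResourceId="cluster:example-aurora-cluster",   # placeholder cluster
        ScalableDimension="rds:cluster:ReadReplicaCount",
        MinCapacity=1,
        MaxCapacity=5,
    )

    # Target tracking policy that adds replicas when average reader CPU rises.
    aas.put_scaling_policy(
        PolicyName="reader-cpu-target-tracking",
        ServiceNamespace="rds",
        ResourceId="cluster:example-aurora-cluster",
        ScalableDimension="rds:cluster:ReadReplicaCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,   # placeholder target
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
            },
        },
    )
    ```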

  • Question 694:

    Some of the company's customers are retrieving records frequently, leading to an increase in costs for the company. The company wants to limit retrieval requests in the future. The company also wants to ensure that if one customer reaches its retrieval limit, other customers will not be affected. (A usage plan sketch follows the answer choices.)

    Which solution will meet these requirements?

    A. Set up server-side throttling limits for API Gateway.

    B. Limit DynamoDB read throughput on the table to an amount that results in the maximum cost that the company is willing to incur.

    C. Set up a usage plan for API Gateway. Implement throttling limits for each customer, and distribute API keys to each customer.

    D. Set up AWS Budgets. Monitor the usage of API Gateway and DynamoDB. Configure an alarm to provide an alert when the cost exceeds a certain threshold each month.
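    The approach described in option C (per-customer throttling through a usage plan and API keys) looks roughly like the boto3 sketch below. The REST API ID, stage name, limits, and customer name are placeholders for an already-deployed API.

    ```python
    # Sketch: API Gateway usage plan with per-customer throttling and API keys.
    import boto3

    apigw = boto3.client("apigateway")

    # Usage plan with throttle and quota limits, attached to a deployed stage.
    plan = apigw.create_usage_plan(
        name="per-customer-plan",
        apiStages=[{"apiId": "abc123restid", "stage": "prod"}],   # placeholder API/stage
        throttle={"rateLimit": 10.0, "burstLimit": 20},
        quota={"limit": 10000, "period": "MONTH"},
    )

    # One API key per customer; a customer that exceeds the limits is throttled
    # individually, without affecting other customers.
    key = apigw.create_api_key(name="customer-a", enabled=True)
    apigw.create_usage_plan_key(
        usagePlanId=plan["id"], keyId=key["id"], keyType="API_KEY"
    )
    ```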

  • Question 695:

    An application runs on Amazon EC2 instances across multiple Availability Zones. The instances run in an Amazon EC2 Auto Scaling group behind an Application Load Balancer. The application performs best when the CPU utilization of the EC2 instances is at or near 40%. (A target tracking sketch follows the answer choices.)

    What should a solutions architect do to maintain the desired performance across all instances in the group?

    A. Use a simple scaling policy to dynamically

    B. Amazon DynamoDB global tables

    C. Amazon RDS for MySQL with Multi-AZ enabled

    D. Amazon RDS for MySQL with a cross-Region snapshot copy
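    Holding average CPU near a fixed value, as the scenario describes, is what a target tracking scaling policy does. A minimal boto3 sketch follows; the Auto Scaling group name is a placeholder, and this illustrates the general mechanism rather than any specific answer choice.

    ```python
    # Sketch: target tracking policy that keeps average group CPU near 40%.
    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.put_scaling_policy(
        AutoScalingGroupName="example-web-asg",   # placeholder group name
        PolicyName="keep-cpu-at-40",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 40.0,
        },
    )
    ```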

  • Question 696:

    A company has three AWS accounts: Management, Development, and Production. These accounts use AWS services only in the us-east-1 Region. All accounts have a VPC with VPC Flow Logs configured to publish data to an Amazon S3 bucket in each separate account. For compliance reasons, the company needs an ongoing method to aggregate all the VPC flow logs across all accounts into one destination S3 bucket in the Management account. (A replication sketch follows the answer choices.)

    What should a solutions architect do to meet these requirements with the LEAST operational overhead?

    A. Add S3 Same-Region Replication rules in each S3 bucket that stores VPC flow logs to replicate objects to the destination S3 bucket. Configure the destination S3 bucket to allow objects to be received from the S3 buckets in other accounts.

    B. Set up an IAM user in the Management account. Grant permissions to the IAM user to access the S3 buckets that contain the VPC flow logs. Run the aws s3 sync command in the AWS CLI to copy the objects to the destination S3 bucket.

    C. Use an S3 inventory report to specify which objects in the S3 buckets to copy. Perform an S3 batch operation to copy the objects into the destination S3 bucket in the Management account with a single request.

    D. Create an AWS Lambda function in the Management account. Grant S3 GET permissions on the source S3 buckets. Grant S3 PUT permissions on the destination S3 bucket. Configure the function to invoke when objects are loaded in the source S3 buckets.
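    The replication pattern in option A can be sketched with a single boto3 call per source bucket. Bucket names, the replication role ARN, and the Management account ID are placeholders, and the sketch assumes the source bucket has Versioning enabled and the destination bucket policy already allows replication from the other accounts.

    ```python
    # Sketch: Same-Region Replication rule pointing at a bucket in another account.
    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_replication(
        Bucket="dev-account-flow-logs",   # source bucket (Versioning assumed enabled)
        ReplicationConfiguration={
            "Role": "arn:aws:iam::111111111111:role/replication-role",  # placeholder role
            "Rules": [
                {
                    "ID": "flow-logs-to-management",
                    "Status": "Enabled",
                    "Priority": 1,
                    "Filter": {"Prefix": ""},                 # replicate all objects
                    "DeleteMarkerReplication": {"Status": "Disabled"},
                    "Destination": {
                        "Bucket": "arn:aws:s3:::management-flow-logs",
                        "Account": "222222222222",            # Management account (placeholder)
                        "AccessControlTranslation": {"Owner": "Destination"},
                    },
                }
            ],
        },
    )
    ```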

  • Question 697:

    A company has NFS servers in an on-premises data center that need to periodically back up small amounts of data to Amazon S3. Which solution meets these requirements and is MOST cost-effective? (A DataSync sketch follows the answer choices.)

    A. Set up AWS Glue to copy the data from the on-premises servers to Amazon S3.

    B. Set up an AWS DataSync agent on the on-premises servers, and sync the data to Amazon S3.

    C. Set up an SFTP sync using AWS Transfer for SFTP to sync data from on premises to Amazon S3.

    D. Set up an AWS Direct Connect connection between the on-premises data center and a VPC, and copy the data to Amazon S3.
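    The DataSync flow named in option B involves an NFS source location, an S3 destination location, and a task that syncs them. The sketch below assumes a DataSync agent has already been deployed and activated on premises; all ARNs, hostnames, and paths are placeholders.

    ```python
    # Sketch: DataSync task from an on-premises NFS export to Amazon S3.
    import boto3

    datasync = boto3.client("datasync")

    nfs_loc = datasync.create_location_nfs(
        ServerHostname="nfs.example.internal",            # placeholder NFS server
        Subdirectory="/exports/backups",
        OnPremConfig={"AgentArns": ["arn:aws:datasync:us-east-1:111111111111:agent/agent-0example"]},
    )

    s3_loc = datasync.create_location_s3(
        S3BucketArn="arn:aws:s3:::example-backup-bucket",
        S3Config={"BucketAccessRoleArn": "arn:aws:iam::111111111111:role/datasync-s3-access"},
    )

    task = datasync.create_task(
        SourceLocationArn=nfs_loc["LocationArn"],
        DestinationLocationArn=s3_loc["LocationArn"],
        Name="periodic-nfs-to-s3",
    )
    datasync.start_task_execution(TaskArn=task["TaskArn"])
    ```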

  • Question 698:

    A company has an Amazon S3 bucket that contains confidential information in its production AWS account. The company has turned on AWS CloudTrail for the account. The account sends a copy of its logs to Amazon CloudWatch Logs. The company has configured the S3 bucket to log read and write data events.

    A company auditor discovers that some objects in the S3 bucket have been deleted. A solutions architect must provide the auditor with information about who deleted the objects. (An Athena query sketch follows the answer choices.)

    What should the solutions architect do to provide this information?

    A. Create a CloudWatch Logs filter to extract the S3 write API calls against the S3 bucket.

    B. Query the CloudTrail logs with Amazon Athena to identify the S3 write API calls against the S3 bucket.

    C. Use AWS Trusted Advisor to perform security checks for S3 write API calls that deleted the content.

    D. Use AWS Config to track configuration changes on the S3 bucket. Use these details to track the S3 write API calls that deleted the content.
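    Querying CloudTrail data events with Athena, as option B describes, could look like the sketch below. It assumes a CloudTrail table named cloudtrail_logs has already been defined in Athena; the database, bucket name, and results location are placeholders.

    ```python
    # Sketch: run an Athena query for S3 delete calls recorded by CloudTrail.
    import boto3

    athena = boto3.client("athena")

    query = """
    SELECT eventtime, useridentity.arn, requestparameters
    FROM cloudtrail_logs
    WHERE eventsource = 's3.amazonaws.com'
      AND eventname IN ('DeleteObject', 'DeleteObjects')
      AND json_extract_scalar(requestparameters, '$.bucketName') = 'confidential-bucket'
    """

    athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": "default"},                  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    ```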

  • Question 699:

    A company has developed a new content-sharing application that runs on Amazon Elastic Container Service (Amazon ECS). The application runs on Amazon Linux Docker tasks that use the Amazon EC2 launch type. The application requires a storage solution that has the following characteristics:

    1. Accessibility for multiple ECS tasks through bind mounts

    2. Resiliency across Availability Zones

    3. Burstable throughput of up to 3 Gbps

    4. Ability to be scaled up over time

    Which storage solution meets these requirements? (A task definition sketch follows the answer choices.)

    A. Launch an Amazon FSx for Windows File Server Multi-AZ instance. Configure the ECS task definitions to mount the Amazon FSx instance volume at launch.

    B. Launch an Amazon Elastic File System (Amazon EFS) instance. Configure the ECS task definitions to mount the EFS Instance volume at launch.

    C. Create a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach set to enabled. Attach the EBS volume to the ECS EC2 instance. Configure ECS task definitions to mount the EBS instance volume at launch.

    D. Launch an EC2 instance with several Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes attached in a RAID 0 configuration. Configure the EC2 instance as an NFS storage server. Configure ECS task definitions to mount the volumes at launch.
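    To show how the ECS-plus-shared-file-system setup named in option B is wired up, here is a minimal task definition sketch that mounts an Amazon EFS file system into a container. The file system ID, image, and names are placeholders.

    ```python
    # Sketch: ECS task definition with an EFS volume and a container mount point.
    import boto3

    ecs = boto3.client("ecs")

    ecs.register_task_definition(
        family="content-sharing-app",
        requiresCompatibilities=["EC2"],
        volumes=[
            {
                "name": "shared-content",
                "efsVolumeConfiguration": {
                    "fileSystemId": "fs-0123456789abcdef0",   # placeholder EFS ID
                    "rootDirectory": "/",
                    "transitEncryption": "ENABLED",
                },
            }
        ],
        containerDefinitions=[
            {
                "name": "app",
                "image": "example/content-sharing:latest",    # placeholder image
                "memory": 512,
                "essential": True,
                "mountPoints": [
                    {"sourceVolume": "shared-content", "containerPath": "/data"}
                ],
            }
        ],
    )
    ```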

  • Question 700:

    A company has hired an external vendor to perform work in the company's AWS account. The vendor uses an automated tool that is hosted in an AWS account that the vendor owns. The vendor does not have IAM access to the company's AWS account. (A cross-account role sketch follows the answer choices.)

    How should a solutions architect grant this access to the vendor?

    A. Create an IAM role in the company's account to delegate access to the vendor's IAM role. Attach the appropriate IAM policies to the role for the permissions that the vendor requires.

    B. Create an IAM user in the company's account with a password that meets the password complexity requirements. Attach the appropriate IAM policies to the user for the permissions that the vendor requires.

    C. Create an IAM group in the company's account. Add the tool's IAM user from the vendor account to the group. Attach the appropriate IAM policies to the group for the permissions that the vendor requires.

    D. Create a new identity provider by choosing "AWS account" as the provider type in the IAM console. Supply the vendor's AWS account ID and user name. Attach the appropriate IAM policies to the new provider for the permissions that the vendor requires.
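    The cross-account role delegation named in option A can be sketched as a role whose trust policy allows the vendor's account to assume it. The vendor account ID, external ID, role name, and attached policy ARN below are placeholders.

    ```python
    # Sketch: IAM role in the company's account that the vendor's account may assume.
    import json
    import boto3

    iam = boto3.client("iam")

    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::999999999999:root"},   # placeholder vendor account
                "Action": "sts:AssumeRole",
                "Condition": {"StringEquals": {"sts:ExternalId": "example-external-id"}},
            }
        ],
    }

    iam.create_role(
        RoleName="VendorAutomationRole",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # Attach only the permissions the vendor's tool actually needs.
    iam.attach_role_policy(
        RoleName="VendorAutomationRole",
        PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",   # placeholder policy
    )
    ```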

Tips on How to Prepare for the Exams

Nowadays, certification exams are becoming more and more important and are required by more and more enterprises when hiring. But how do you prepare for the exam effectively? How do you prepare for the exam in a short time with less effort? How do you get an ideal result, and how do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your SAA-C02 exam preparation or your Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions here.