Exam Details

  • Exam Code: SAA-C02
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1080 Q&As
  • Last Updated: Jun 04, 2025

Amazon Certifications SAA-C02 Questions & Answers

  • Question 911:

    A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template.

    What should the solutions architect do to meet these requirements?

    A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.

    B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.

    C. Use the parameter section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.

    D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access secret keys, and pass them to the application instances through the user data.
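
    For illustration: the reason an instance profile avoids credentials in the template is that the SDK on the instance obtains temporary credentials from the instance metadata service at runtime. The following Python (boto3) snippet is a minimal sketch under that assumption; the table name "UserData", the key attribute, and the Region are hypothetical and not taken from the question.

    import boto3

    # On an EC2 instance that has an IAM role attached through an instance
    # profile, boto3 resolves temporary credentials from the instance metadata
    # service automatically; no access key or secret key appears in code or in
    # the CloudFormation template.
    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    table = dynamodb.Table("UserData")  # hypothetical table name

    response = table.get_item(Key={"UserId": "example-user"})  # hypothetical key
    print(response.get("Item"))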

  • Question 912:

    A company maintains a searchable repository of items on its website. The data is stored in an Amazon RDS for MySQL database table that contains more than 10 million rows. The database has 2 TB of General Purpose SSD storage. There are millions of updates against this data every day through the company's website. The company has noticed that some insert operations are taking 10 seconds or longer. The company has determined that the database storage performance is the problem. Which solution addresses this performance issue?

    A. Change the storage type to Provisioned IOPS SSD

    B. Change the DB instance to a memory optimized instance class

    C. Change the DB instance to a burstable performance instance class

    D. Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.
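
    As a point of reference, changing the storage type of an existing RDS instance is a single API call. The Python (boto3) sketch below is illustrative only; the DB instance identifier and the IOPS value are placeholders, not details from the question.

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Convert the instance's storage from General Purpose SSD to Provisioned
    # IOPS SSD (io1). The identifier and IOPS figure are placeholders.
    rds.modify_db_instance(
        DBInstanceIdentifier="catalog-db",
        StorageType="io1",
        Iops=20000,
        ApplyImmediately=True,
    )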

  • Question 913:

    A company needs to create an Amazon Elastic Kubernetes Service (Amazon EKS) cluster to host a digital media streaming application. The EKS cluster will use a managed node group that is backed by Amazon Elastic Block Store (Amazon EBS) volumes for storage. The company must encrypt all data at rest by using a customer managed key that is stored in AWS Key Management Service (AWS KMS).

    Which combination of actions will meet this requirement with the LEAST operational overhead? (Select TWO.)

    A. Use a Kubernetes plugin that uses the customer managed key to perform data encryption.

    B. After creation of the EKS cluster, locate the EBS volumes. Enable encryption by using the customer managed key.

    C. Enable EBS encryption by default in the AWS Region where the EKS cluster will be created. Select the customer managed key as the default key.

    D. Create the EKS cluster. Create an IAM role that has a policy that grants permission to the customer managed key. Associate the role with the EKS cluster.

    E. Store the customer managed key as a Kubernetes secret in the EKS cluster. Use the customer managed key to encrypt the EBS volumes.
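
    For context, encryption of EBS volumes by default with a customer managed key is a Region-level setting that can be applied before the cluster is created. The Python (boto3) sketch below is a minimal example; the KMS key ARN and the Region are placeholders.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Turn on EBS encryption by default for the Region and make the customer
    # managed key the default encryption key (the ARN is a placeholder).
    ec2.enable_ebs_encryption_by_default()
    ec2.modify_ebs_default_kms_key_id(
        KmsKeyId="arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
    )

    # Any EBS volume created afterwards in this Region, including volumes for
    # the EKS managed node group, is encrypted with that key.
    print(ec2.get_ebs_encryption_by_default()["EbsEncryptionByDefault"])
    print(ec2.get_ebs_default_kms_key_id()["KmsKeyId"])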

  • Question 914:

    A company recently launched a variety of new workloads on Amazon EC2 instances in its AWS account. The company needs to create a strategy to access and administer the instances remotely and securely. The company needs to implement a repeatable process that works with native AWS services and follows the AWS Well-Architected Framework.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use the EC2 serial console to directly access the terminal interface of each instance for administration.

    B. Attach the appropriate IAM role to each existing instance and new instance. Use AWS Systems Manager Session Manager to establish a remote SSH session.

    C. Create an administrative SSH key pair. Load the public key into each EC2 instance. Deploy a bastion host in a public subnet to provide a tunnel for administration of each instance.

    D. Establish an AWS Site-to-Site VPN connection. Instruct administrators to use their local on-premises machines to connect directly to the instances by using SSH keys across the VPN tunnel.
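
    For reference, Systems Manager Session Manager needs only the SSM agent and an instance profile with the appropriate permissions; no inbound ports, bastion hosts, or SSH keys are required. The Python (boto3) sketch below is illustrative, with a placeholder instance ID; in practice the interactive shell is usually opened with the AWS CLI (aws ssm start-session), which consumes the stream that this API call returns.

    import boto3

    ssm = boto3.client("ssm", region_name="us-east-1")

    # Open a Session Manager session against a placeholder instance ID. The
    # returned stream URL and token are normally consumed by the Session
    # Manager plugin rather than used directly.
    session = ssm.start_session(Target="i-0123456789abcdef0")
    print(session["SessionId"])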

  • Question 915:

    A company stores its application logs in an Amazon CloudWatch Logs log group. A new policy requires the company to store all application logs in Amazon OpenSearch Service (Amazon Elasticsearch Service) in near-real time. Which solution will meet this requirement with the LEAST operational overhead?

    A. Configure a CloudWatch Logs subscription to stream the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

    B. Create an AWS Lambda function. Use the log group to invoke the function to write the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

    C. Create an Amazon Kinesis Data Firehose delivery stream. Configure the log group as the delivery stream's source. Configure Amazon OpenSearch Service (Amazon Elasticsearch Service) as the delivery stream's destination.

    D. Install and configure Amazon Kinesis Agent on each application server to deliver the logs to Amazon Kinesis Data Streams. Configure Kinesis Data Streams to deliver the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).
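
    For context, near-real-time streaming out of a log group is configured with a CloudWatch Logs subscription filter. The Python (boto3) sketch below is a minimal example with placeholder names and ARNs; the destination here is assumed to be a Kinesis Data Firehose delivery stream, although a subscription filter can also target a Lambda function that forwards events to the OpenSearch Service domain.

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Attach a subscription filter to the application log group. The log group
    # name, destination ARN, and role ARN are all placeholders.
    logs.put_subscription_filter(
        logGroupName="/app/production",
        filterName="ship-to-opensearch",
        filterPattern="",  # empty pattern forwards every log event
        destinationArn="arn:aws:firehose:us-east-1:111122223333:deliverystream/app-logs",
        roleArn="arn:aws:iam::111122223333:role/CWLtoFirehoseRole",
    )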

  • Question 916:

    A company hosts a data lake on AWS. The data lake consists of data in Amazon S3 and Amazon RDS for PostgreSQL. The company needs a reporting solution that provides data visualization and includes all the data sources within the data lake. Only the company's management team should have full access to all the visualizations. The rest of the company should have only limited access.

    Which solution will meet these requirements?

    A. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate IAM roles.

    B. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate users and groups.

    C. Create an AWS Glue table and crawler for the data in Amazon S3. Create an AWS Glue extract, transform, and load (ETL) job to produce reports. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.

    D. Create an AWS Glue table and crawler for the data in Amazon S3. Use Amazon Athena Federated Query to access data within Amazon RDS for PostgreSQL. Generate reports by using Amazon Athena. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.
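
    For illustration, sharing a published dashboard with specific users or groups is handled through QuickSight's own permissions API rather than S3 policies or IAM roles. The Python (boto3) call below is a hedged sketch; the account ID, dashboard ID, group ARN, and action list are placeholders chosen for the example.

    import boto3

    quicksight = boto3.client("quicksight", region_name="us-east-1")

    # Grant a QuickSight group read access to a published dashboard.
    quicksight.update_dashboard_permissions(
        AwsAccountId="111122223333",
        DashboardId="data-lake-overview",
        GrantPermissions=[
            {
                "Principal": "arn:aws:quicksight:us-east-1:111122223333:group/default/management",
                "Actions": [
                    "quicksight:DescribeDashboard",
                    "quicksight:QueryDashboard",
                    "quicksight:ListDashboardVersions",
                ],
            }
        ],
    )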

  • Question 917:

    A company is running a popular social media website. The website gives users the ability to upload images to share with other users. The company wants to make sure that the images do not contain inappropriate content. The company needs a solution that minimizes development effort.

    What should a solutions architect do to meet these requirements?

    A. Use Amazon Comprehend to detect inappropriate content. Use human review for low-confidence predictions.

    B. Use Amazon Rekognition to detect inappropriate content. Use human review for low-confidence predictions.

    C. Use Amazon SageMaker to detect inappropriate content. Use Ground Truth to label low-confidence predictions.

    D. Use AWS Fargate to deploy a custom machine learning model to detect inappropriate content. Use Amazon SageMaker Ground Truth to label low-confidence predictions.
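
    For reference, image moderation without building or training a model comes down to one managed API call. The Python (boto3) sketch below is illustrative; the bucket name, object key, and confidence threshold are placeholders, and predictions below the threshold could be routed to human review (for example with Amazon Augmented AI).

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Check an uploaded image for unsafe content; bucket and key are placeholders.
    response = rekognition.detect_moderation_labels(
        Image={"S3Object": {"Bucket": "user-uploads", "Name": "photos/example.jpg"}},
        MinConfidence=80,
    )

    for label in response["ModerationLabels"]:
        print(label["Name"], label["Confidence"])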

  • Question 918:

    A company has an AWS Glue extract, transform, and load (ETL) job that runs every day at the same time. The job processes XML data that is in an Amazon S3 bucket.

    New data is added to the S3 bucket every day. A solutions architect notices that AWS Glue is processing all the data during each run.

    What should the solutions architect do to prevent AWS Glue from reprocessing old data?

    A. Edit the job to use job bookmarks.

    B. Edit the job to delete data after the data is processed.

    C. Edit the job by setting the NumberOfWorkers field to 1.

    D. Use a FindMatches machine learning (ML) transform.
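
    For context, job bookmarks are enabled through a job argument rather than a code change. The Python (boto3) sketch below is a minimal example with a hypothetical job name; with the bookmark option enabled, each run processes only the S3 objects added since the last successful run.

    import boto3

    glue = boto3.client("glue", region_name="us-east-1")

    # Start the daily run with job bookmarks enabled so previously processed
    # objects are skipped. The job name is a placeholder.
    glue.start_job_run(
        JobName="daily-xml-etl",
        Arguments={"--job-bookmark-option": "job-bookmark-enable"},
    )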

  • Question 919:

    A company wants to build a data lake on AWS from data that is stored in an on-premises Oracle relational database. The data lake must receive ongoing updates from the on-premises database.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use AWS DataSync to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.

    B. Use AWS Snowball to transfer the data to Amazon S3. Use AWS Batch to transform the data and integrate the data into a data lake.

    C. Use AWS Database Migration Service (AWS DMS) to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.

    D. Use an Amazon EC2 instance to transfer the data to Amazon S3. Configure the EC2 instance to transform the data and integrate the data into a data lake.
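
    For illustration, ongoing replication from a relational source into Amazon S3 maps to a DMS task whose migration type includes change data capture (CDC). The Python (boto3) sketch below is simplified; every ARN is a placeholder, and the source endpoint, target endpoint, and replication instance are assumed to exist already.

    import json
    import boto3

    dms = boto3.client("dms", region_name="us-east-1")

    # Select every schema and table from the Oracle source.
    table_mappings = {
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-all",
                "object-locator": {"schema-name": "%", "table-name": "%"},
                "rule-action": "include",
            }
        ]
    }

    # A full load runs first, then ongoing change data capture keeps the S3
    # data lake in sync with the on-premises database.
    dms.create_replication_task(
        ReplicationTaskIdentifier="oracle-to-s3-data-lake",
        SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SOURCE",
        TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TARGET",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INSTANCE",
        MigrationType="full-load-and-cdc",
        TableMappings=json.dumps(table_mappings),
    )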

  • Question 920:

    A company is developing a new machine learning (ML) model solution on AWS. The models are developed as independent microservices that fetch approximately 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent.

    The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time.

    Which design should a solutions architect recommend to meet these requirements?

    A. Direct the requests from the API to a Network Load Balancer (NLB). Deploy the models as AWS Lambda functions that are invoked by the NLB.

    B. Direct the requests from the API to an Application Load Balancer (ALB). Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from an Amazon Simple Queue Service (Amazon SQS) queue. Use AWS App Mesh to scale the instances of the ECS cluster based on the SQS queue size.

    C. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as AWS Lambda functions that are invoked by SQS events. Use AWS Auto Scaling to increase the number of vCPUs for the Lambda functions based on the SQS queue size.

    D. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from the queue. Enable AWS Auto Scaling on Amazon ECS for both the cluster and copies of the service based on the queue size.
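
    For reference, scaling an ECS service on queue depth is normally wired up through Application Auto Scaling. The Python (boto3) sketch below is illustrative; the cluster, service, and queue names and the capacity and target values are placeholders. It registers the service's desired count as a scalable target and attaches a target-tracking policy driven by the SQS ApproximateNumberOfMessagesVisible metric.

    import boto3

    autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")

    # Register the ECS service's desired task count as a scalable target.
    autoscaling.register_scalable_target(
        ServiceNamespace="ecs",
        ResourceId="service/ml-cluster/model-service",
        ScalableDimension="ecs:service:DesiredCount",
        MinCapacity=1,
        MaxCapacity=50,
    )

    # Track the backlog of visible messages in the request queue; the service
    # scales out when the backlog grows and scales in as it drains.
    autoscaling.put_scaling_policy(
        PolicyName="scale-on-queue-depth",
        ServiceNamespace="ecs",
        ResourceId="service/ml-cluster/model-service",
        ScalableDimension="ecs:service:DesiredCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 100.0,
            "CustomizedMetricSpecification": {
                "MetricName": "ApproximateNumberOfMessagesVisible",
                "Namespace": "AWS/SQS",
                "Dimensions": [{"Name": "QueueName", "Value": "model-requests"}],
                "Statistic": "Average",
            },
        },
    )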

Tips on How to Prepare for the Exams

Certification exams have become increasingly important, and more and more employers require them when hiring. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? You will find the answers on Vcedump.com. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your SAA-C02 exam preparation or your Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.