Exam Details

  • Exam Code: SAA-C02
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1080 Q&As
  • Last Updated: Jun 04, 2025

Amazon Certifications SAA-C02 Questions & Answers

  • Question 881:

    A company needs to keep user transaction data in an Amazon DynamoDB table.

    The company must retain the data for 7 years.

    What is the MOST operationally efficient solution that meets these requirements?

    A. Use DynamoDB point-in-time recovery to back up the table continuously.

    B. Use AWS Backup to create backup schedules and retention policies for the table.

    C. Create an on-demand backup of the table by using the DynamoDB console. Store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.

    D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
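    To make the retention requirement concrete, here is a minimal sketch of the AWS Backup approach described in option B: a backup plan with a schedule and a lifecycle rule that deletes recovery points after 7 years. The plan name, rule name, vault name, and schedule are illustrative placeholders, not details from the question; the payload shape follows the AWS Backup `CreateBackupPlan` API.

    ```python
    # Sketch of option B: an AWS Backup plan with a 7-year retention policy
    # for a DynamoDB table. Names and schedule are placeholders.

    RETENTION_DAYS = 7 * 365  # 7-year retention, ignoring leap days

    def build_backup_plan(vault_name="Default"):
        """Return a BackupPlan payload shaped for backup.create_backup_plan()."""
        return {
            "BackupPlanName": "dynamodb-7-year-plan",  # illustrative name
            "Rules": [
                {
                    "RuleName": "daily-7-year-retention",
                    "TargetBackupVaultName": vault_name,
                    "ScheduleExpression": "cron(0 5 * * ? *)",  # daily, 05:00 UTC
                    "Lifecycle": {"DeleteAfterDays": RETENTION_DAYS},
                }
            ],
        }

    plan = build_backup_plan()
    ```

    With boto3, this payload would be passed as `backup.create_backup_plan(BackupPlan=plan)`, and the DynamoDB table would be attached through a backup selection; the key point for the exam scenario is that retention and deletion are handled by the managed service, not by custom code.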

  • Question 882:

    A company deploys Amazon EC2 instances that run in a VPC. The EC2 instances load source data into Amazon S3 buckets so that the data can be processed in the future. According to compliance laws, the data must not be transmitted over the public internet. Servers in the company's on-premises data center will consume the output from an application that runs on the EC2 instances.

    Which solution will meet these requirements?

    A. Deploy an interface VPC endpoint for Amazon EC2. Create an AWS Site-to-Site VPN connection between the company and the VPC.

    B. Deploy a gateway VPC endpoint for Amazon S3. Set up an AWS Direct Connect connection between the on-premises network and the VPC.

    C. Set up an AWS Transit Gateway connection from the VPC to the S3 buckets. Create an AWS Site-to-Site VPN connection between the company and the VPC.

    D. Set up proxy EC2 instances that have routes to NAT gateways. Configure the proxy EC2 instances to fetch S3 data and feed the application instances.
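    A gateway VPC endpoint for S3 (option B) keeps S3 traffic on the AWS network, and an endpoint policy can scope that traffic to specific buckets. The sketch below builds such a policy document; the bucket name is a placeholder, not from the question.

    ```python
    # Sketch of an endpoint policy for a gateway VPC endpoint for Amazon S3,
    # scoping access to a single bucket. The bucket name is a placeholder.

    def s3_gateway_endpoint_policy(bucket="example-source-data-bucket"):
        """Return an IAM policy document for the S3 gateway endpoint."""
        return {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
                    "Resource": [
                        f"arn:aws:s3:::{bucket}",    # bucket-level (ListBucket)
                        f"arn:aws:s3:::{bucket}/*",  # object-level actions
                    ],
                }
            ],
        }
    ```

    The on-premises side of the requirement is covered by Direct Connect, which provides a private network path; no proxy instances or NAT gateways are needed.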

  • Question 883:

    A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket. Which solution will meet these requirements with the LEAST development effort?

    A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.

    B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.

    C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.

    D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.
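    The Glue ETL job in option B can be created with a small payload, which is a useful contrast to writing and operating Spark or Lambda code yourself. The sketch below shows illustrative parameters shaped like the Glue `CreateJob` API; the job name, IAM role, script path, and bucket names are placeholders.

    ```python
    # Illustrative parameters for the Glue ETL job in option B, shaped like
    # glue.create_job(). All values are placeholders; the parameter names
    # (Name, Role, Command, DefaultArguments) come from the Glue API.

    def glue_csv_to_parquet_job(script_bucket="example-scripts",
                                output_bucket="example-transformed-data"):
        return {
            "Name": "csv-to-parquet",  # hypothetical job name
            "Role": "arn:aws:iam::123456789012:role/GlueJobRole",  # placeholder
            "Command": {
                "Name": "glueetl",  # Spark-based Glue ETL job type
                "ScriptLocation": f"s3://{script_bucket}/csv_to_parquet.py",
                "PythonVersion": "3",
            },
            "DefaultArguments": {
                "--output_path": f"s3://{output_bucket}/",  # transformed bucket
            },
        }
    ```

    The transformation script itself (reading the crawled CSV table and writing Parquet to the output path) can be generated by Glue Studio, which is what makes this the least-development-effort choice.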

  • Question 884:

    A company stores data in an Amazon Aurora PostgreSQL DB cluster. The company must store all the data for 5 years and must delete all the data after 5 years. The company also must indefinitely keep audit logs of actions that are performed within the database. Currently, the company has automated backups configured for Aurora.

    Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

    A. Take a manual snapshot of the DB cluster.

    B. Create a lifecycle policy for the automated backups.

    C. Configure automated backup retention for 5 years.

    D. Configure an Amazon CloudWatch Logs export for the DB cluster.

    E. Use AWS Backup to take the backups and to keep the backups for 5 years.
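    The audit-log half of this question (option D) comes down to exporting the database logs to Amazon CloudWatch Logs, where they can be retained indefinitely. A minimal sketch of the parameters, shaped like the RDS `ModifyDBCluster` API, follows; the cluster identifier is a placeholder.

    ```python
    # Sketch of option D: export Aurora PostgreSQL logs to CloudWatch Logs.
    # Parameters are shaped like rds.modify_db_cluster(); the cluster
    # identifier is a placeholder.

    def enable_audit_log_export(cluster_id="example-aurora-cluster"):
        return {
            "DBClusterIdentifier": cluster_id,
            "CloudwatchLogsExportConfiguration": {
                "EnableLogTypes": ["postgresql"],  # Aurora PostgreSQL log type
            },
            "ApplyImmediately": True,
        }
    ```

    Once exported, the CloudWatch Logs log group's retention setting controls how long the audit trail is kept, independently of the database's 5-year backup lifecycle.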

  • Question 885:

    A hospital wants to create digital copies for its large collection of historical written records. The hospital will continue to add hundreds of new documents each day. The hospital's data team will scan the documents and will upload the documents to the AWS Cloud.

    A solutions architect must implement a solution to analyze the documents, extract the medical information, and store the documents so that an application can run SQL queries on the data. The solution must maximize scalability and operational efficiency.

    Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

    A. Write the document information to an Amazon EC2 instance that runs a MySQL database.

    B. Write the document information to an Amazon S3 bucket. Use Amazon Athena to query the data.

    C. Create an Auto Scaling group of Amazon EC2 instances to run a custom application that processes the scanned files and extracts the medical information.

    D. Create an AWS Lambda function that runs when new documents are uploaded. Use Amazon Rekognition to convert the documents to raw text. Use Amazon Transcribe Medical to detect and extract relevant medical information from the text.

    E. Create an AWS Lambda function that runs when new documents are uploaded. Use Amazon Textract to convert the documents to raw text. Use Amazon Comprehend Medical to detect and extract relevant medical information from the text.
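    The pipeline in option E can be sketched in a few lines: Textract turns a scanned document into raw text, and Comprehend Medical extracts medical entities from it. In the sketch below the clients are passed in as parameters so the flow can be exercised with stubs; in a real Lambda function they would come from `boto3.client("textract")` and `boto3.client("comprehendmedical")`.

    ```python
    # Sketch of option E's Lambda pipeline. Clients are injected so the
    # control flow can run without AWS credentials; the API calls
    # (detect_document_text, detect_entities_v2) are real Textract and
    # Comprehend Medical operations.

    def process_document(textract, comprehend_medical, bucket, key):
        # Textract reads the scanned file directly from S3
        resp = textract.detect_document_text(
            Document={"S3Object": {"Bucket": bucket, "Name": key}}
        )
        # Join only LINE blocks; Textract also returns PAGE and WORD blocks
        text = " ".join(
            b["Text"] for b in resp["Blocks"] if b["BlockType"] == "LINE"
        )
        # Comprehend Medical extracts conditions, medications, dosages, etc.
        entities = comprehend_medical.detect_entities_v2(Text=text)["Entities"]
        return text, entities
    ```

    Pairing this with option B (write the extracted information to S3 and query it with Athena) satisfies the SQL-query requirement without managing any servers.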

  • Question 886:

    An online retail company needs to run near-real-time analytics on website traffic to analyze top-selling products across different locations. The product purchase data and the user location details are sent to a third-party application that runs on premises. The application processes the data and moves the data into the company's analytics engine.

    The company needs to implement a cloud-based solution to make the data available for near-real-time analytics.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use Amazon Kinesis Data Streams to ingest the data. Use AWS Lambda to transform the data. Configure Lambda to write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).

    B. Configure Amazon Kinesis Data Streams to write the data to an Amazon S3 bucket. Schedule an AWS Glue crawler job to enrich the data and update the AWS Glue Data Catalog. Use Amazon Athena for analytics.

    C. Configure Amazon Kinesis Data Streams to write the data to an Amazon S3 bucket. Add an Apache Spark job on Amazon EMR to enrich the data in the S3 bucket and write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).

    D. Use Amazon Kinesis Data Firehose to ingest the data. Enable Kinesis Data Firehose data transformation with AWS Lambda. Configure Kinesis Data Firehose to write the data to Amazon OpenSearch Service (Amazon Elasticsearch Service).
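    The Lambda transformation that option D attaches to Kinesis Data Firehose has a documented contract: the function receives base64-encoded records and must return each record with its original `recordId`, a `result` status, and re-encoded `data`. The sketch below follows that contract; the enrichment itself (upper-casing a location field) is a placeholder, not part of the question.

    ```python
    # Sketch of a Firehose data-transformation Lambda (option D). The
    # event/response shape (recordId, base64 data, result "Ok") is the
    # documented Firehose contract; the enrichment is a placeholder.
    import base64
    import json

    def handler(event, context):
        output = []
        for record in event["records"]:
            payload = json.loads(base64.b64decode(record["data"]))
            # Placeholder enrichment: normalize the location field
            payload["location"] = payload.get("location", "").upper()
            output.append({
                "recordId": record["recordId"],  # must echo the original ID
                "result": "Ok",  # "Dropped" and "ProcessingFailed" also valid
                "data": base64.b64encode(
                    (json.dumps(payload) + "\n").encode()
                ).decode(),
            })
        return {"records": output}
    ```

    Because Firehose manages buffering, retries, and delivery to OpenSearch Service, this is the lowest-operational-overhead pattern among the options.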

  • Question 887:

    A company hosts an application on AWS. The application uses AWS Lambda functions and stores data in Amazon DynamoDB tables. The Lambda functions are connected to a VPC that does not have internet access.

    The traffic to access DynamoDB must not travel across the internet. The application must have write access to only specific DynamoDB tables.

    Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

    A. Attach a VPC endpoint policy for DynamoDB to allow write access to only the specific DynamoDB tables.

    B. Attach a security group to the interface VPC endpoint to allow write access to only the specific DynamoDB tables.

    C. Create a resource-based IAM policy to grant write access to only the specific DynamoDB tables. Attach the policy to the DynamoDB tables.

    D. Create a gateway VPC endpoint for DynamoDB that is associated with the Lambda VPC. Ensure that the Lambda execution role can access the gateway VPC endpoint.

    E. Create an interface VPC endpoint for DynamoDB that is associated with the Lambda VPC. Ensure that the Lambda execution role can access the interface VPC endpoint.
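    Option A's endpoint policy can be sketched directly: a gateway endpoint for DynamoDB carries the traffic privately, and the policy attached to it restricts write actions to named tables. The account ID, region, and table names below are placeholders, not details from the question.

    ```python
    # Sketch of option A: a VPC endpoint policy that allows write access to
    # only specific DynamoDB tables. Account, region, and table names are
    # placeholders.

    def dynamodb_endpoint_policy(tables=("Orders", "Customers"),
                                 region="us-east-1",
                                 account="123456789012"):
        return {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": [
                        "dynamodb:PutItem",
                        "dynamodb:UpdateItem",
                        "dynamodb:BatchWriteItem",
                    ],
                    "Resource": [
                        f"arn:aws:dynamodb:{region}:{account}:table/{t}"
                        for t in tables
                    ],
                }
            ],
        }
    ```

    Note that gateway endpoints are not interface endpoints: they have no ENI, so security groups (option B) do not apply to them, which is part of what this question is testing.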

  • Question 888:

    A company produces batch data that comes from different databases. The company also produces live stream data from network sensors and application APIs. The company needs to consolidate all the data into one place for business analytics. The company needs to process the incoming data and then stage the data in different Amazon S3 buckets. Teams will later run one-time queries and import the data into a business intelligence tool to show key performance indicators (KPIs).

    Which combination of steps will meet these requirements with the LEAST operational overhead? (Select TWO.)

    A. Use Amazon Athena for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.

    B. Use Amazon Kinesis Data Analytics for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.

    C. Create custom AWS Lambda functions to move the individual records from the databases to an Amazon Redshift cluster.

    D. Use an AWS Glue extract, transform, and load (ETL) job to convert the data into JSON format. Load the data into multiple Amazon OpenSearch Service (Amazon Elasticsearch Service) clusters.

    E. Use blueprints in AWS Lake Formation to identify the data that can be ingested into a data lake. Use AWS Glue to crawl the source, extract the data, and load the data into Amazon S3 in Apache Parquet format.

  • Question 889:

    A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead.

    Which solution will meet these requirements?

    A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.

    B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.

    C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.

    D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.
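    The Transfer Family server in option A is fully managed and highly available out of the box. The sketch below shows illustrative parameters shaped like the Transfer Family `CreateServer` API; the parameter names are real, and the values reflect the scenario (public SFTP endpoint backed by S3).

    ```python
    # Illustrative parameters for option A's managed SFTP endpoint, shaped
    # like transfer.create_server(). Parameter names come from the AWS
    # Transfer Family API; the combination shown matches the scenario.

    def transfer_sftp_server_params():
        return {
            "Protocols": ["SFTP"],                     # SFTP only
            "Domain": "S3",                            # store files in Amazon S3
            "EndpointType": "PUBLIC",                  # publicly accessible
            "IdentityProviderType": "SERVICE_MANAGED", # users managed by the service
        }
    ```

    Each partner user is then mapped to an S3 bucket and prefix via a Transfer Family user and IAM role, so uploads land in the data lake with no servers, VPNs, or cron jobs to operate.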

  • Question 890:

    A hospital recently deployed a RESTful API with Amazon API Gateway and AWS Lambda. The hospital uses API Gateway and Lambda to upload reports that are in PDF format and JPEG format. The hospital needs to modify the Lambda code to identify protected health information (PHI) in the reports.

    Which solution will meet these requirements with the LEAST operational overhead?

    A. Use existing Python libraries to extract the text from the reports and to identify the PHI from the extracted text.

    B. Use Amazon Textract to extract the text from the reports. Use Amazon SageMaker to identify the PHI from the extracted text.

    C. Use Amazon Textract to extract the text from the reports. Use Amazon Comprehend Medical to identify the PHI from the extracted text.

    D. Use Amazon Rekognition to extract the text from the reports. Use Amazon Comprehend Medical to identify the PHI from the extracted text.
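    The PHI-detection step from option C maps to a single Comprehend Medical call, `DetectPHI`, which returns only PHI entities (names, dates, identifiers, and so on). In the sketch below the client is injected so the logic can run without AWS credentials; in the real Lambda it would be `boto3.client("comprehendmedical")`, fed with text that Textract extracted from the uploaded report.

    ```python
    # Sketch of the PHI-identification step in option C. The detect_phi
    # operation is a real Comprehend Medical API; the client is injected
    # here so the function can be exercised with a stub.

    def find_phi(comprehend_medical, text):
        resp = comprehend_medical.detect_phi(Text=text)
        # Each entity carries the matched text and a PHI type such as NAME
        return [(e["Text"], e["Type"]) for e in resp["Entities"]]
    ```

    Because both Textract and Comprehend Medical are managed APIs, no model training or library maintenance is needed, which is what makes this the least-operational-overhead choice.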

Tips on How to Prepare for the Exams

Certification exams have become increasingly important, and more and more enterprises require them when hiring. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your SAA-C02 exam preparation or Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.