Exam Details

  • Exam Code: SAA-C03
  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 1304 Q&As
  • Last Updated: Jun 07, 2025

Amazon Certifications SAA-C03 Questions & Answers

  • Question 841:

    A company is building a solution that will report Amazon EC2 Auto Scaling events across all the applications in an AWS account. The company needs to use a serverless solution to store the EC2 Auto Scaling status data in Amazon S3. The company then will use the data in Amazon S3 to provide near-real-time updates in a dashboard. The solution must not affect the speed of EC2 instance launches.

    How should the company move the data to Amazon S3 to meet these requirements?

    A. Use an Amazon CloudWatch metric stream to send the EC2 Auto Scaling status data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.

    B. Launch an Amazon EMR cluster to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.

    C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function on a schedule. Configure the Lambda function to send the EC2 Auto Scaling status data directly to Amazon S3.

    D. Use a bootstrap script during the launch of an EC2 instance to install Amazon Kinesis Agent. Configure Kinesis Agent to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose. Store the data in Amazon S3.
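To make the metric-stream pattern in option A concrete, here is a rough sketch of the request parameters a CloudWatch metric stream might take: it filters the Auto Scaling namespace and delivers to a Kinesis Data Firehose stream that is backed by S3. All names and ARNs are invented, and the snippet only builds the parameters; it does not call AWS.

```python
# Hypothetical parameters for cloudwatch.put_metric_stream: filter the
# AWS/AutoScaling namespace and deliver to a Firehose stream that
# writes to S3. Stream name, ARNs, and account ID are placeholders.
metric_stream_params = {
    "Name": "asg-status-stream",
    "IncludeFilters": [{"Namespace": "AWS/AutoScaling"}],
    "FirehoseArn": "arn:aws:firehose:us-east-1:123456789012:deliverystream/asg-to-s3",
    "RoleArn": "arn:aws:iam::123456789012:role/metric-stream-role",
    "OutputFormat": "json",
}
# An actual call would look like:
#   boto3.client("cloudwatch").put_metric_stream(**metric_stream_params)
```

Because the metric stream and Firehose sit outside the scaling group's launch path, this pattern adds no latency to instance launches.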

  • Question 842:

    A company recently released a new type of internet-connected sensor. The company is expecting to sell thousands of sensors, which are designed to stream high volumes of data each second to a central location. A solutions architect must design a solution that ingests and stores data so that engineering teams can analyze it in near-real time with millisecond responsiveness.

    Which solution should the solutions architect recommend?

    A. Use an Amazon SQS queue to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon Redshift.

    B. Use an Amazon SQS queue to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon DynamoDB.

    C. Use Amazon Kinesis Data Streams to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon Redshift.

    D. Use Amazon Kinesis Data Streams to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon DynamoDB.
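As an illustration of the Kinesis ingestion half of options C and D, the snippet below shows the shape of one sensor reading as a `put_record` request. The stream name and field names are made up, and nothing is actually sent to AWS.

```python
import json

# A hypothetical sensor reading, serialized the way a producer might
# hand it to kinesis.put_record. Partitioning by sensor ID spreads
# records across shards while keeping each sensor's data ordered.
reading = {"sensor_id": "sensor-0001", "temperature_c": 21.7, "ts": 1700000000}
record = {
    "StreamName": "sensor-ingest",
    "Data": json.dumps(reading).encode("utf-8"),
    "PartitionKey": reading["sensor_id"],
}
# An actual call: boto3.client("kinesis").put_record(**record)
```

A Lambda consumer subscribed to the stream would receive batches of such records and write them to the chosen data store.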

  • Question 843:

    A company wants to build a data lake on AWS from data that is stored in an on-premises Oracle relational database. The data lake must receive ongoing updates from the on-premises database. Which solution will meet these requirements with the LEAST operational overhead?

    A. Use AWS DataSync to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.

    B. Use AWS Snowball to transfer the data to Amazon S3. Use AWS Batch to transform the data and integrate the data into a data lake.

    C. Use AWS Database Migration Service (AWS DMS) to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a data lake.

    D. Use an Amazon EC2 instance to transfer the data to Amazon S3. Configure the EC2 instance to transform the data and integrate the data into a data lake.
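To illustrate the "ongoing updates" requirement, here is a sketch of the DMS piece from option C: a replication task configured for a full load plus ongoing change data capture (CDC) from the Oracle source into an S3 target. All identifiers and ARNs are invented, and only the request parameters are built.

```python
# Hypothetical parameters for dms.create_replication_task. The key
# setting for continuous updates is MigrationType="full-load-and-cdc":
# an initial bulk copy followed by ongoing replication of changes.
dms_task_params = {
    "ReplicationTaskIdentifier": "oracle-to-s3-datalake",
    "SourceEndpointArn": "arn:aws:dms:us-east-1:123456789012:endpoint:oracle-src",
    "TargetEndpointArn": "arn:aws:dms:us-east-1:123456789012:endpoint:s3-tgt",
    "ReplicationInstanceArn": "arn:aws:dms:us-east-1:123456789012:rep:inst-1",
    "MigrationType": "full-load-and-cdc",
    "TableMappings": '{"rules": []}',  # table-selection rules go here
}
```

AWS Glue would then crawl and transform the S3 objects that DMS lands, without any servers to manage.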

  • Question 844:

    A company has an ordering application that stores customer information in Amazon RDS for MySQL. During regular business hours, employees run one-time queries for reporting purposes. Timeouts are occurring during order processing because the reporting queries are taking a long time to run. The company needs to eliminate the timeouts without preventing employees from performing queries.

    What should a solutions architect do to meet these requirements?

    A. Create a read replica. Move reporting queries to the read replica.

    B. Create a read replica. Distribute the ordering application to the primary DB instance and the read replica.

    C. Migrate the ordering application to Amazon DynamoDB with on-demand capacity.

    D. Schedule the reporting queries for non-peak hours.
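A read replica, as in options A and B, gives long-running reporting queries their own endpoint so they cannot block order processing on the primary. As a rough sketch, these are the parameters such a replica might be created with; all identifiers are hypothetical and no AWS call is made.

```python
# Hypothetical parameters for rds.create_db_instance_read_replica:
# a replica of the orders database that reporting tools can query
# through its own endpoint, isolated from transactional writes.
replica_params = {
    "DBInstanceIdentifier": "orders-db-replica",
    "SourceDBInstanceIdentifier": "orders-db",
    "DBInstanceClass": "db.r6g.large",  # sized for reporting workloads
}
# An actual call:
#   boto3.client("rds").create_db_instance_read_replica(**replica_params)
```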

  • Question 845:

    A company must save all the email messages that its employees send to customers for a period of 12 months. The messages are stored in a binary format and vary in size from 1 KB to 20 KB. The company has selected Amazon S3 as the storage service for the messages.

    Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

    A. Create an S3 bucket policy that denies the s3:DeleteObject action.

    B. Create an S3 lifecycle configuration that deletes the messages after 12 months.

    C. Upload the messages to Amazon S3. Use S3 Object Lock in governance mode.

    D. Upload the messages to Amazon S3. Use S3 Object Lock in compliance mode.

    E. Use S3 Inventory. Create an AWS Batch job that periodically scans the inventory and deletes the messages after 12 months.
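Option B's lifecycle rule can be sketched as the payload for `put_bucket_lifecycle_configuration`: objects under a prefix expire (are deleted) 365 days after creation, with no custom cleanup jobs to run. The bucket prefix is hypothetical.

```python
# Hypothetical lifecycle configuration: S3 automatically deletes
# objects under the emails/ prefix 365 days after they are created.
lifecycle = {
    "Rules": [
        {
            "ID": "expire-emails-after-12-months",
            "Filter": {"Prefix": "emails/"},
            "Status": "Enabled",
            "Expiration": {"Days": 365},
        }
    ]
}
# An actual call: boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-email-archive", LifecycleConfiguration=lifecycle)
```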

  • Question 846:

    A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand.

    Which migration solution will meet these requirements?

    A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.

    B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.

    C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.

    D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.

  • Question 847:

    A company needs to develop a repeatable solution to process time-ordered information from websites around the world. The company collects the data from the websites by using Amazon Kinesis Data Streams and stores the data in Amazon S3.

    The processing logic needs to collect events and handle data from the last 5 years.

    The processing logic also must generate results in an S3 bucket so that a business intelligence application can analyze and compare the results. The processing must be repeated multiple times.

    What should a solutions architect do to meet these requirements?

    A. Use Amazon S3 to collect events. Create an AWS Lambda function to process the events. Create different Lambda functions to handle repeated processing.

    B. Use Amazon EventBridge (Amazon CloudWatch Events) to collect events. Set AWS Lambda as an event target. Use EventBridge (CloudWatch Events) to create an archive for the events and to replay the events.

    C. Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to collect events. Process the events by using Amazon EC2. Use AWS Step Functions to create an archive for the events and to replay the events.

    D. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to collect events. Process the events by using Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MSK to create an archive for the events and to replay the events.
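The archive-and-replay idea in option B can be sketched as an EventBridge archive: events on a bus are retained for a set period and can later be replayed through the same rules for repeated processing. The bus ARN and account ID below are placeholders, and no AWS call is made.

```python
# Hypothetical parameters for events.create_archive: retain events from
# the default bus for 5 years (matching the question's look-back window)
# so they can be replayed into the processing pipeline on demand.
archive_params = {
    "ArchiveName": "website-events-archive",
    "EventSourceArn": "arn:aws:events:us-east-1:123456789012:event-bus/default",
    "RetentionDays": 5 * 365,
}
# An actual call: boto3.client("events").create_archive(**archive_params)
```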

  • Question 848:

    A company deploys Amazon EC2 instances that run in a VPC. The EC2 instances load source data into Amazon S3 buckets so that the data can be processed in the future. According to compliance laws, the data must not be transmitted over the public internet. Servers in the company's on-premises data center will consume the output from an application that runs on the EC2 instances.

    Which solution will meet these requirements?

    A. Deploy an interface VPC endpoint for Amazon EC2. Create an AWS Site-to-Site VPN connection between the company and the VPC.

    B. Deploy a gateway VPC endpoint for Amazon S3. Set up an AWS Direct Connect connection between the on-premises network and the VPC.

    C. Set up an AWS Transit Gateway connection from the VPC to the S3 buckets. Create an AWS Site-to-Site VPN connection between the company and the VPC.

    D. Set up proxy EC2 instances that have routes to NAT gateways. Configure the proxy EC2 instances to fetch S3 data and feed the application instances.
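The gateway endpoint in option B can be sketched as the parameters for `create_vpc_endpoint`: once the endpoint is attached to the VPC's route tables, S3 traffic stays on the AWS network instead of crossing the public internet. The VPC, route table, and Region identifiers below are made up.

```python
# Hypothetical parameters for ec2.create_vpc_endpoint: a gateway-type
# endpoint for S3, attached to a route table so that S3-bound traffic
# from the subnets is routed privately.
endpoint_params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0abc1234",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    "RouteTableIds": ["rtb-0def5678"],
}
# An actual call: boto3.client("ec2").create_vpc_endpoint(**endpoint_params)
```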

  • Question 849:

    A company has a business system that generates hundreds of reports each day. The business system saves the reports to a network share in CSV format. The company needs to store this data in the AWS Cloud in near-real time for analysis. Which solution will meet these requirements with the LEAST administrative overhead?

    A. Use AWS DataSync to transfer the files to Amazon S3. Create a scheduled task that runs at the end of each day.

    B. Create an Amazon S3 File Gateway. Update the business system to use a new network share from the S3 File Gateway.

    C. Use AWS DataSync to transfer the files to Amazon S3. Create an application that uses the DataSync API in the automation workflow.

    D. Deploy an AWS Transfer for SFTP endpoint. Create a script that checks for new files on the network share and uploads the new files by using SFTP.

  • Question 850:

    A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The website serves static content. Website traffic is increasing, and the company is concerned about a potential increase in cost. What should a solutions architect do to reduce the cost of the website?

    A. Create an Amazon CloudFront distribution to cache static files at edge locations.

    B. Create an Amazon ElastiCache cluster. Connect the ALB to the ElastiCache cluster to serve cached files.

    C. Create an AWS WAF web ACL, and associate it with the ALB. Add a rule to the web ACL to cache static files.

    D. Create a second ALB in an alternative AWS Region. Route user traffic to the closest Region to minimize data transfer costs.
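To make option A's CloudFront-in-front-of-an-ALB pattern concrete, here is an abridged sketch of a distribution config: the ALB is the origin, and repeated requests for static files are answered from edge caches instead of the EC2 instances. Only the fields relevant here are shown (a real `DistributionConfig` requires more), and the ALB domain name is invented.

```python
# Abridged, hypothetical fragment of a CloudFront DistributionConfig:
# the ALB is registered as a custom origin, and the default cache
# behavior serves cached copies of static content from edge locations.
distribution_config = {
    "Origins": {"Quantity": 1, "Items": [{
        "Id": "alb-origin",
        "DomainName": "my-alb-123456.us-east-1.elb.amazonaws.com",
        "CustomOriginConfig": {
            "HTTPPort": 80,
            "HTTPSPort": 443,
            "OriginProtocolPolicy": "https-only",
        },
    }]},
    "DefaultCacheBehavior": {
        "TargetOriginId": "alb-origin",
        "ViewerProtocolPolicy": "redirect-to-https",
    },
}
```

Cache hits at the edge reduce both ALB request volume and EC2 data transfer, which is where the cost saving comes from.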

Tips on How to Prepare for the Exams

Nowadays, certification exams are becoming more important and are required by more and more enterprises when you apply for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you achieve an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your SAA-C03 exam preparation or Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.