Exam Details

  • Exam Code: SAP-C02
  • Exam Name: AWS Certified Solutions Architect - Professional (SAP-C02)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 733 Q&As
  • Last Updated: Jun 14, 2025

Amazon Certifications SAP-C02 Questions & Answers

  • Question 511:

A company recently started hosting new application workloads in the AWS Cloud. The company is using Amazon EC2 instances, Amazon Elastic File System (Amazon EFS) file systems, and Amazon RDS DB instances.

    To meet regulatory and business requirements, the company must make the following changes for data backups:

1. Backups must be retained based on custom daily, weekly, and monthly requirements.

    2. Backups must be replicated to at least one other AWS Region immediately after capture.

    3. The backup solution must provide a single source of backup status across the AWS environment.

    4. The backup solution must send immediate notifications upon failure of any resource backup.

    Which combination of steps will meet these requirements with the LEAST amount of operational overhead? (Select THREE.)

    A. Create an AWS Backup plan with a backup rule for each of the retention requirements.

    B. Configure an AWS Backup plan to copy backups to another Region.

    C. Create an AWS Lambda function to replicate backups to another Region and send notification if a failure occurs.

D. Add an Amazon Simple Notification Service (Amazon SNS) topic to the backup plan to send a notification for finished jobs that have any status except BACKUP_JOB_COMPLETED.

    E. Create an Amazon Data Lifecycle Manager (Amazon DLM) snapshot lifecycle policy for each of the retention requirements.

    F. Set up RDS snapshots on each database.
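
To make the AWS Backup mechanics in options A, B, and D concrete, here is a minimal boto3 sketch of a backup plan with a per-frequency retention rule and an immediate cross-Region copy action, plus vault-level SNS notifications. All names, ARNs, schedules, and retention periods are illustrative assumptions, not values from the question.

```python
import boto3

backup = boto3.client("backup")

# Destination vault in another Region -- hypothetical ARN.
DR_VAULT_ARN = "arn:aws:backup:us-west-2:111122223333:backup-vault:dr-vault"

backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "regulatory-backups",
        "Rules": [
            {   # One rule per retention requirement (daily shown; weekly
                # and monthly rules would follow the same shape).
                "RuleName": "daily",
                "TargetBackupVaultName": "primary-vault",
                "ScheduleExpression": "cron(0 5 ? * * *)",  # 05:00 UTC daily
                "Lifecycle": {"DeleteAfterDays": 35},
                # Copy every recovery point to the other Region on capture.
                "CopyActions": [{"DestinationBackupVaultArn": DR_VAULT_ARN}],
            },
        ],
    }
)

# Vault-level SNS notifications; a subscription filter policy on the message's
# "State" field can then forward only non-COMPLETED jobs (option D's idea).
backup.put_backup_vault_notifications(
    BackupVaultName="primary-vault",
    SNSTopicArn="arn:aws:sns:us-east-1:111122223333:backup-alerts",  # hypothetical
    BackupVaultEvents=["BACKUP_JOB_COMPLETED"],
)
```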

  • Question 512:

A car rental company has built a serverless REST API to provide data to its mobile app. The app consists of an Amazon API Gateway API with a Regional endpoint, AWS Lambda functions, and an Amazon Aurora MySQL Serverless DB cluster. The company recently opened the API to mobile apps of partners. A significant increase in the number of requests resulted, causing sporadic database memory errors. Analysis of the API traffic indicates that clients are making multiple HTTP GET requests for the same queries in a short period of time. Traffic is concentrated during business hours, with spikes around holidays and other events.

    The company needs to improve its ability to support the additional usage while minimizing the increase in costs associated with the solution.

    Which strategy meets these requirements?

A. Convert the API Gateway Regional endpoint to an edge-optimized endpoint. Enable caching in the production stage.

    B. Implement an Amazon ElastiCache for Redis cache to store the results of the database calls. Modify the Lambda functions to use the cache.

    C. Modify the Aurora Serverless DB cluster configuration to increase the maximum amount of available memory.

    D. Enable throttling in the API Gateway production stage. Set the rate and burst values to limit the incoming calls.
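
As a reference point for the caching half of option A, API Gateway stage caching can be enabled with a stage patch operation, after which responses to repeated GET requests are served from the cache for the configured TTL. The API ID and stage name below are placeholders.

```python
import boto3

apigw = boto3.client("apigateway")

# Enable a 0.5 GB cache cluster on the production stage (IDs are placeholders).
apigw.update_stage(
    restApiId="a1b2c3d4e5",
    stageName="prod",
    patchOperations=[
        {"op": "replace", "path": "/cacheClusterEnabled", "value": "true"},
        {"op": "replace", "path": "/cacheClusterSize", "value": "0.5"},
    ],
)
```

Option B's ElastiCache approach targets the same repeated reads but requires code changes in the Lambda functions and a cache cluster to operate.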

  • Question 513:

A solutions architect must provide a secure way for a team of cloud engineers to use the AWS CLI to upload objects into an Amazon S3 bucket. Each cloud engineer has an IAM user, IAM access keys, and a virtual multi-factor authentication (MFA) device. The IAM users for the cloud engineers are in a group that is named S3-access. The cloud engineers must use MFA to perform any actions in Amazon S3.

    Which solution will meet these requirements?

A. Attach a policy to the S3 bucket to prompt the IAM user for an MFA code when the IAM user performs actions on the S3 bucket. Use IAM access keys with the AWS CLI to call Amazon S3.

    B. Update the trust policy for the S3-access group to require principals to use MFA when principals assume the group. Use IAM access keys with the AWS CLI to call Amazon S3.

    C. Attach a policy to the S3-access group to deny all S3 actions unless MFA is present. Use IAM access keys with the AWS CLI to call Amazon S3.

    D. Attach a policy to the S3-access group to deny all S3 actions unless MFA is present. Request temporary credentials from AWS Security Token Service (AWS STS). Attach the temporary credentials in a profile that Amazon S3 will reference when the user performs actions in Amazon S3.
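
Options C and D both turn on a deny-unless-MFA policy. A minimal sketch of such a policy attached to the S3-access group follows; the policy name is hypothetical, and the aws:MultiFactorAuthPresent condition key is the standard way to test for MFA.

```python
import json

import boto3

iam = boto3.client("iam")

deny_s3_without_mfa = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "s3:*",
            "Resource": "*",
            # BoolIfExists also denies requests where the key is absent,
            # which covers plain long-term access keys used without MFA.
            "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}},
        }
    ],
}

iam.put_group_policy(
    GroupName="S3-access",
    PolicyName="DenyS3WithoutMFA",  # hypothetical name
    PolicyDocument=json.dumps(deny_s3_without_mfa),
)
```

With such a policy attached, an engineer would typically call sts get-session-token with an MFA code and place the returned temporary credentials in a CLI profile before calling Amazon S3.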

  • Question 514:

A company with several AWS accounts is using AWS Organizations and service control policies (SCPs). An Administrator created the following SCP and has attached it to an organizational unit (OU) that contains AWS account 1111-1111-1111. Developers working in account 1111-1111-1111 complain that they cannot create Amazon S3 buckets. How should the Administrator address this problem?

A. Add s3:CreateBucket with Allow effect to the SCP.

    B. Remove the account from the OU, and attach the SCP directly to account 1111-1111-1111.

    C. Instruct the Developers to add Amazon S3 permissions to their IAM entities.

    D. Remove the SCP from account 1111-1111-1111.
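
The SCP referenced in the question is not reproduced in this dump, so it cannot be reconstructed here. Purely to illustrate what option A describes, a statement allowing bucket creation might be merged into the SCP as follows; the policy ID and document are assumptions, not the question's exhibit.

```python
import json

import boto3

orgs = boto3.client("organizations")

# Hypothetical SCP content adding s3:CreateBucket with an Allow effect.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:CreateBucket", "Resource": "*"},
        # ...the SCP's existing statements would remain here...
    ],
}

orgs.update_policy(PolicyId="p-examplepolicyid", Content=json.dumps(scp))
```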

  • Question 515:

A solutions architect is auditing the security setup of an AWS Lambda function for a company. The Lambda function retrieves the latest changes from an Amazon Aurora database. The Lambda function and the database run in the same VPC. Lambda environment variables provide the database credentials to the Lambda function.

    The Lambda function aggregates data and makes the data available in an Amazon S3 bucket that is configured for server-side encryption with AWS KMS managed encryption keys (SSE-KMS). The data must not travel across the internet. If any database credentials become compromised, the company needs a solution that minimizes the impact of the compromise.

    What should the solutions architect recommend to meet these requirements?

    A. Enable IAM database authentication on the Aurora DB cluster. Change the IAM role for the Lambda function to allow the function to access the database by using IAM database authentication. Deploy a gateway VPC endpoint for Amazon S3 in the VPC.

    B. Enable IAM database authentication on the Aurora DB cluster. Change the IAM role for the Lambda function to allow the function to access the database by using IAM database authentication. Enforce HTTPS on the connection to Amazon S3 during data transfers.

    C. Save the database credentials in AWS Systems Manager Parameter Store. Set up password rotation on the credentials in Parameter Store. Change the IAM role for the Lambda function to allow the function to access Parameter Store. Modify the Lambda function to retrieve the credentials from Parameter Store. Deploy a gateway VPC endpoint for Amazon S3 in the VPC.

D. Save the database credentials in AWS Secrets Manager. Set up password rotation on the credentials in Secrets Manager. Change the IAM role for the Lambda function to allow the function to access Secrets Manager. Modify the Lambda function to retrieve the credentials from Secrets Manager. Enforce HTTPS on the connection to Amazon S3 during data transfers.
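
Two mechanics recur across these options: keeping S3 traffic off the internet with a gateway VPC endpoint, and replacing static environment-variable credentials with a rotating secret. A brief sketch of both, with placeholder IDs:

```python
import boto3

# Gateway VPC endpoint so S3 traffic stays on the AWS network
# (VPC, Region, and route table IDs are placeholders).
ec2 = boto3.client("ec2")
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0abc1234def567890",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0abc1234def567890"],
)

# Inside the Lambda function: fetch rotating database credentials at run time
# instead of reading environment variables (secret ID is hypothetical).
secrets = boto3.client("secretsmanager")
db_secret = secrets.get_secret_value(SecretId="prod/aurora/app")["SecretString"]
```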

  • Question 516:

A media company has a 30-TB repository of digital news videos. These videos are stored on tape in an on-premises tape library and referenced by a Media Asset Management (MAM) system. The company wants to enrich the metadata for these videos in an automated fashion and put them into a searchable catalog by using a MAM feature. The company must be able to search based on information in the video, such as objects, scenery items, or people's faces. A catalog is available that contains an image of the face of each person who has appeared in the videos. The company would like to migrate these videos to AWS.

    The company has a high-speed AWS Direct Connect connection with AWS and would like to move the MAM solution video content directly from its current file system.

    How can these requirements be met by using the LEAST amount of ongoing management overhead and causing MINIMAL disruption to the existing system?

A. Set up an AWS Storage Gateway, file gateway appliance on-premises. Use the MAM solution to extract the videos from the current archive and push them into the file gateway. Use the catalog of faces to build a collection in Amazon Rekognition. Build an AWS Lambda function that invokes the Rekognition JavaScript SDK to have Rekognition pull the video from the Amazon S3 files backing the file gateway, retrieve the required metadata, and push the metadata into the MAM solution.

    B. Set up an AWS Storage Gateway, tape gateway appliance on-premises. Use the MAM solution to extract the videos from the current archive and push them into the tape gateway. Use the catalog of faces to build a collection in Amazon Rekognition. Build an AWS Lambda function that invokes the Rekognition JavaScript SDK to have Amazon Rekognition process the video in the tape gateway, retrieve the required metadata, and push the metadata into the MAM solution.

    C. Configure a video ingestion stream by using Amazon Kinesis Video Streams. Use the catalog of faces to build a collection in Amazon Rekognition. Stream the videos from the MAM solution into Kinesis Video Streams. Configure Amazon Rekognition to process the streamed videos. Then, use a stream consumer to retrieve the required metadata, and push the metadata into the MAM solution. Configure the stream to store the videos in Amazon S3.

    D. Set up an Amazon EC2 instance that runs the OpenCV libraries. Copy the videos, images, and face catalog from the on-premises library into an Amazon EBS volume mounted on this EC2 instance. Process the videos to retrieve the required metadata, and push the metadata into the MAM solution, while also copying the video files to an Amazon S3 bucket.
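
Options A and B both lean on an Amazon Rekognition face collection built from the catalog, followed by asynchronous analysis of S3-hosted video. A minimal sketch with placeholder bucket and object names:

```python
import boto3

rekognition = boto3.client("rekognition")

# Build the collection from the face catalog (bucket/keys are placeholders).
rekognition.create_collection(CollectionId="news-video-faces")
rekognition.index_faces(
    CollectionId="news-video-faces",
    Image={"S3Object": {"Bucket": "face-catalog", "Name": "people/jane-doe.jpg"}},
    ExternalImageId="jane-doe",
)

# Start an asynchronous face search against a video stored in Amazon S3.
job = rekognition.start_face_search(
    CollectionId="news-video-faces",
    Video={"S3Object": {"Bucket": "mam-videos", "Name": "clips/segment-001.mp4"}},
)
print(job["JobId"])
```

Rekognition Video reads source video from Amazon S3, which is why a file gateway (whose data is backed by S3 objects) pairs with it more naturally than a tape gateway.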

  • Question 517:

A company runs applications on Amazon EC2 instances. The company plans to begin using an Auto Scaling group for the instances. As part of this transition, a solutions architect must ensure that Amazon CloudWatch Logs automatically collects logs from all new instances. The new Auto Scaling group will use a launch template that includes the Amazon Linux 2 AMI and no key pair.

    Which solution meets these requirements?

A. Create an Amazon CloudWatch agent configuration for the workload. Store the CloudWatch agent configuration in an Amazon S3 bucket. Write an EC2 user data script to fetch the configuration file from Amazon S3. Configure the CloudWatch agent on the instance during initial boot.

    B. Create an Amazon CloudWatch agent configuration for the workload in AWS Systems Manager Parameter Store. Create a Systems Manager document that installs and configures the CloudWatch agent by using the configuration. Create an Amazon EventBridge (Amazon CloudWatch Events) rule on the default event bus with a Systems Manager Run Command target that runs the document whenever an instance enters the running state.

    C. Create an Amazon CloudWatch agent configuration for the workload. Create an AWS Lambda function to install and configure the CloudWatch agent by using AWS Systems Manager Session Manager. Include the agent configuration inside the Lambda package. Create an AWS Config custom rule to identify changes to the EC2 instances and invoke the Lambda function.

    D. Create an Amazon CloudWatch agent configuration for the workload. Save the CloudWatch agent configuration as part of an AWS Lambda deployment package. Use AWS CloudTrail to capture EC2 tagging events and initiate agent installation. Use AWS CodeBuild to configure the CloudWatch agent on the instances that run the workload.
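
Option B's event wiring can be sketched as follows: an EventBridge rule that matches instances entering the running state, with a Systems Manager Run Command document as the target. The document name, role ARN, and Auto Scaling group tag value are assumptions.

```python
import json

import boto3

events = boto3.client("events")

# Match any EC2 instance entering the running state.
events.put_rule(
    Name="configure-cloudwatch-agent",
    EventPattern=json.dumps({
        "source": ["aws.ec2"],
        "detail-type": ["EC2 Instance State-change Notification"],
        "detail": {"state": ["running"]},
    }),
)

# Target the Systems Manager document that installs and configures the agent
# (document ARN, role ARN, and tag value are placeholders).
events.put_targets(
    Rule="configure-cloudwatch-agent",
    Targets=[{
        "Id": "install-cw-agent",
        "Arn": "arn:aws:ssm:us-east-1:111122223333:document/InstallCWAgent",
        "RoleArn": "arn:aws:iam::111122223333:role/EventBridgeRunCommand",
        "RunCommandParameters": {
            "RunCommandTargets": [
                {"Key": "tag:aws:autoscaling:groupName", "Values": ["web-asg"]}
            ]
        },
    }],
)
```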

  • Question 518:

    A company processes environmental data. The company has set up sensors to provide a continuous stream of data from different areas in a city. The data is available in JSON format.

    The company wants to use an AWS solution to send the data to a database that does not require fixed schemas for storage. The data must be sent in real time.

    Which solution will meet these requirements?

    A. Use Amazon Kinesis Data Firehose to send the data to Amazon Redshift.

B. Use Amazon Kinesis Data Streams to send the data to Amazon DynamoDB.

    C. Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to send the data to Amazon Aurora.

    D. Use Amazon Kinesis Data Firehose to send the data to Amazon Keyspaces (for Apache Cassandra).
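
For orientation, option B's pieces look like this in boto3: a producer writes raw JSON to a Kinesis data stream, and a consumer (for example, a Lambda function on the stream) writes each record into DynamoDB, which imposes no fixed schema beyond the key attributes. Stream and table names are placeholders.

```python
import json
from decimal import Decimal

import boto3

# Producer side: send a schemaless JSON reading in real time.
kinesis = boto3.client("kinesis")
reading = {"sensor_id": "air-17", "pm25": 12.4, "ts": "2024-06-01T00:00:00Z"}
kinesis.put_record(
    StreamName="city-sensor-data",
    Data=json.dumps(reading).encode("utf-8"),
    PartitionKey=reading["sensor_id"],
)

# Consumer side: persist the record. DynamoDB rejects Python floats,
# so re-parse numbers as Decimal first.
item = json.loads(json.dumps(reading), parse_float=Decimal)
boto3.resource("dynamodb").Table("SensorReadings").put_item(Item=item)
```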

  • Question 519:

An adventure company has launched a new feature on its mobile app. Users can use the feature to upload their hiking and rafting photos and videos anytime. The photos and videos are stored in Amazon S3 Standard storage in an S3 bucket and are served through Amazon CloudFront.

    The company needs to optimize the cost of the storage. A solutions architect discovers that most of the uploaded photos and videos are accessed infrequently after 30 days. However, some of the uploaded photos and videos are accessed frequently after 30 days. The solutions architect needs to implement a solution that maintains millisecond retrieval availability of the photos and videos at the lowest possible cost.

    Which solution will meet these requirements?

    A. Configure S3 Intelligent-Tiering on the S3 bucket.

    B. Configure an S3 Lifecycle policy to transition image objects and video objects from S3 Standard to S3 Glacier Deep Archive after 30 days.

    C. Replace Amazon S3 with an Amazon Elastic File System (Amazon EFS) file system that is mounted on Amazon EC2 instances.

    D. Add a Cache-Control: max-age header to the S3 image objects and S3 video objects. Set the header to 30 days.
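
For reference, option A's Intelligent-Tiering can be applied either at upload time (storage class INTELLIGENT_TIERING) or retroactively with a lifecycle transition, as sketched below; the bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")

# Transition current and future objects into S3 Intelligent-Tiering, which
# moves objects between frequent- and infrequent-access tiers automatically
# while keeping millisecond retrieval.
s3.put_bucket_lifecycle_configuration(
    Bucket="adventure-media",  # placeholder
    LifecycleConfiguration={
        "Rules": [{
            "ID": "to-intelligent-tiering",
            "Status": "Enabled",
            "Filter": {},
            "Transitions": [{"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}],
        }]
    },
)
```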

  • Question 520:

A large company has many business units. Each business unit has multiple AWS accounts for different purposes. The CIO of the company sees that each business unit has data that would be useful to share with other parts of the company. In total, there are about 10 PB of data that needs to be shared with users in 1,000 AWS accounts. The data is proprietary, so some of it should only be available to users with specific job types. Some of the data is used for throughput-intensive workloads such as simulations. The number of accounts changes frequently because of new initiatives, acquisitions, and divestitures.

    A solutions architect has been asked to design a system that will allow for sharing data for use in AWS with all of the employees in the company.

Which approach will allow for secure data sharing in a scalable way?

A. Store the data in a single Amazon S3 bucket. Create an IAM role for every combination of job type and business unit that allows for appropriate read/write access based on object prefixes in the S3 bucket. The roles should have trust policies that allow the business unit's AWS accounts to assume their roles. Use IAM in each business unit's AWS account to prevent them from assuming roles for a different job type. Users get credentials to access the data by using AssumeRole from their business unit's AWS account. Users can then use those credentials with an S3 client.

    B. Store the data in a single Amazon S3 bucket. Write a bucket policy that uses conditions to grant read and write access where appropriate, based on each user's business unit and job type. Determine the business unit with the AWS account accessing the bucket and the job type with a prefix in the IAM user's name. Users can access data by using IAM credentials from their business unit's AWS account with an S3 client.

    C. Store the data in a series of Amazon S3 buckets. Create an application running in Amazon EC2 that is integrated with the company's identity provider (IdP), authenticates users, and allows them to download or upload data through the application. The application uses the business unit and job type information in the IdP to control what users can upload and download through the application. The users can access the data through the application's API.

    D. Store the data in a series of Amazon S3 buckets. Create an AWS STS token vending machine that is integrated with the company's identity provider (IdP). When a user logs in, have the token vending machine attach an IAM policy that assumes the role that limits the user's access and/or upload to only the data the user is authorized to access. Users can get credentials by authenticating to the token vending machine's website or API and then use those credentials with an S3 client.
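
Options A and D both end with users holding temporary credentials and using them against S3. The underlying STS flow is short; the role ARN, session name, and bucket below are placeholders.

```python
import boto3

sts = boto3.client("sts")

# Exchange the caller's identity for a scoped, temporary role session.
session = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/bu-simulations-read",
    RoleSessionName="jane-doe",
)
creds = session["Credentials"]

# Use the temporary credentials with a plain S3 client.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
s3.list_objects_v2(Bucket="shared-simulation-data")
```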

Tips on How to Prepare for the Exams

Nowadays, certification exams have become more and more important and are required by more and more enterprises when applying for a job. But how do you prepare for the exam effectively? How do you prepare for the exam in a short time with less effort? How do you get an ideal result, and how do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are confused about your SAP-C02 exam preparation and Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.