Exam Details

  • Exam Code: DBS-C01
  • Exam Name: AWS Certified Database - Specialty (DBS-C01)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 321 Q&As
  • Last Updated: Apr 22, 2025

Amazon Certifications DBS-C01 Questions & Answers

  • Question 121:

    A vehicle insurance company needs to choose a highly available database to track vehicle owners and their insurance details. The persisted data should be immutable in the database, including the complete and sequenced history of changes over time with all the owners and insurance transfer details for a vehicle.

    The data should be easily verifiable for the data lineage of an insurance claim.

    Which approach meets these requirements with MINIMAL effort?

    A. Create a blockchain to store the insurance details. Validate the data using a hash function to verify the data lineage of an insurance claim.

    B. Create an Amazon DynamoDB table to store the insurance details. Validate the data using AWS DMS validation by moving the data to Amazon S3 to verify the data lineage of an insurance claim.

    C. Create an Amazon QLDB ledger to store the insurance details. Validate the data by choosing the ledger name in the digest request to verify the data lineage of an insurance claim.

    D. Create an Amazon Aurora database to store the insurance details. Validate the data using AWS DMS validation by moving the data to Amazon S3 to verify the data lineage of an insurance claim.
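
    As background for option C: Amazon QLDB exposes its digest through a single API call, which is the basis for verifying a claim's data lineage. A minimal boto3 sketch (the ledger name is a hypothetical placeholder):

```python
import boto3

LEDGER_NAME = "vehicle-insurance-ledger"  # hypothetical ledger name

qldb = boto3.client("qldb")

# QLDB returns a SHA-256 digest covering the ledger's complete,
# immutable journal up to the returned tip address; documents can
# later be cryptographically verified against this digest.
response = qldb.get_digest(Name=LEDGER_NAME)

print("Digest:", response["Digest"])
print("Digest tip address:", response["DigestTipAddress"]["IonText"])
```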

  • Question 122:

    A company is migrating its on-premises database workloads to the AWS Cloud. A database specialist performing the move has chosen AWS DMS to migrate an Oracle database with a large table to Amazon RDS. The database specialist notices that AWS DMS is taking significant time to migrate the data. Which actions would improve the data migration speed? (Choose three.)

    A. Create multiple AWS DMS tasks to migrate the large table.

    B. Configure the AWS DMS replication instance with Multi-AZ.

    C. Increase the capacity of the AWS DMS replication server.

    D. Establish an AWS Direct Connect connection between the on-premises data center and AWS.

    E. Enable an Amazon RDS Multi-AZ configuration.

    F. Enable full large binary object (LOB) mode to migrate all LOB data for all large tables.
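
    As background on DMS tuning: a large partitioned table can be split across parallel full-load threads with a table-settings rule. A minimal boto3 sketch, with a placeholder task ARN and hypothetical schema and table names:

```python
import json

import boto3

dms = boto3.client("dms")

# Table-mapping rules: select the large table, then load its source
# partitions in parallel instead of with a single full-load thread.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-large-table",
            "object-locator": {"schema-name": "SALES", "table-name": "ORDERS"},
            "rule-action": "include",
        },
        {
            "rule-type": "table-settings",
            "rule-id": "2",
            "rule-name": "parallel-load-large-table",
            "object-locator": {"schema-name": "SALES", "table-name": "ORDERS"},
            "parallel-load": {"type": "partitions-auto"},
        },
    ]
}

dms.modify_replication_task(
    ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",
    TableMappings=json.dumps(table_mappings),
)
```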

  • Question 123:

    A company has an existing system that uses a single-instance Amazon DocumentDB (with MongoDB compatibility) cluster. Read requests account for 75% of the system queries. Write requests are expected to increase by 50% after an upcoming global release. A database specialist needs to design a solution that improves the overall database performance without creating additional application overhead.

    Which solution will meet these requirements?

    A. Recreate the cluster with a shared cluster volume. Add two instances to serve both read requests and write requests.

    B. Add one read replica instance. Activate a shared cluster volume. Route all read queries to the read replica instance.

    C. Add one read replica instance. Set the read preference to secondary preferred.

    D. Add one read replica instance. Update the application to route all read queries to the read replica instance.
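
    As background on the read-preference options: with a MongoDB-compatible driver, routing reads to replicas is a connection-string setting rather than an application change. A minimal pymongo sketch with placeholder endpoint and credentials (TLS CA options omitted for brevity):

```python
from pymongo import MongoClient

# secondaryPreferred sends reads to a replica when one is available
# and falls back to the primary otherwise; no per-query code changes.
client = MongoClient(
    "mongodb://user:password@mycluster.cluster-example.us-east-1"
    ".docdb.amazonaws.com:27017/?tls=true&replicaSet=rs0"
    "&readPreference=secondaryPreferred&retryWrites=false"
)

db = client["app"]
print(db["feedback"].count_documents({}))  # served by a replica if healthy
```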

  • Question 124:

    A gaming company has recently acquired a successful iOS game, which is particularly popular during the holiday season. The company has decided to add a leaderboard to the game that uses Amazon DynamoDB. The application load is expected to ramp up over the holiday season.

    Which solution will meet these requirements at the lowest cost?

    A. DynamoDB Streams

    B. DynamoDB with DynamoDB Accelerator

    C. DynamoDB with on-demand capacity mode

    D. DynamoDB with provisioned capacity mode with Auto Scaling
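
    For context on the capacity-mode options: on-demand billing is a single table setting, while provisioned mode with Auto Scaling requires capacity targets. A minimal boto3 sketch of an on-demand leaderboard table (table and attribute names are hypothetical):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# PAY_PER_REQUEST (on-demand) removes capacity planning: the table
# absorbs the holiday ramp-up and bills per request.
dynamodb.create_table(
    TableName="game-leaderboard",
    AttributeDefinitions=[{"AttributeName": "player_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "player_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```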

  • Question 125:

    A manufacturing company stores its inventory details in an Amazon DynamoDB table in the us-east-2 Region. According to new compliance and regulatory policies, the company is required to back up all of its tables nightly and store these backups in the us-west-2 Region for disaster recovery for 1 year.

    Which solution MOST cost-effectively meets these requirements?

    A. Convert the existing DynamoDB table into a global table and create a global table replica in the us-west-2 Region.

    B. Use AWS Backup to create a backup plan. Configure cross-Region replication in the plan and assign the DynamoDB table to this plan.

    C. Create an on-demand backup of the DynamoDB table and restore this backup in the us-west-2 Region.

    D. Enable Amazon S3 Cross-Region Replication (CRR) on the S3 bucket where DynamoDB on-demand backups are stored.
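
    As background for option B: an AWS Backup plan can schedule nightly backups, copy each recovery point to another Region, and enforce one-year retention in one place. A minimal boto3 sketch with placeholder vault names and account ID (assigning the table to the plan is a separate create_backup_selection call):

```python
import boto3

backup = boto3.client("backup")

# Nightly rule: back up at 05:00 UTC, copy to a vault in us-west-2,
# and delete both copies after 365 days.
backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "dynamodb-nightly-dr",
        "Rules": [
            {
                "RuleName": "nightly-cross-region",
                "TargetBackupVaultName": "source-vault",
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 365},
                "CopyActions": [
                    {
                        "DestinationBackupVaultArn": (
                            "arn:aws:backup:us-west-2:123456789012"
                            ":backup-vault:dr-vault"
                        ),
                        "Lifecycle": {"DeleteAfterDays": 365},
                    }
                ],
            }
        ],
    }
)
```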

  • Question 126:

    A company's application team needs to select an AWS managed database service to store application and user data. The application team is familiar with MySQL but is open to new solutions. The application and user data is stored in 10 tables and is denormalized. The application will access this data through an API layer using a unique ID in each table. The company expects the traffic to be light at first, but the traffic will increase to thousands of transactions each second within the first year. The database service must support active reads and writes in multiple AWS Regions at the same time. Query response times need to be less than 100 ms.

    Which AWS database solution will meet these requirements?

    A. Deploy an Amazon RDS for MySQL environment in each Region and leverage AWS Database Migration Service (AWS DMS) to set up a multi-Region bidirectional replication

    B. Deploy an Amazon Aurora MySQL global database with write forwarding turned on

    C. Deploy an Amazon DynamoDB database with global tables

    D. Deploy an Amazon DocumentDB global cluster across multiple Regions.
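
    As background for the multi-Region options: DynamoDB global tables provide active-active reads and writes, and adding a replica Region is a single table update. A minimal boto3 sketch (table name and Regions are hypothetical):

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Adding a replica Region converts the table into a global table
# (version 2019.11.21) with active-active, multi-Region writes.
dynamodb.update_table(
    TableName="app-data",
    ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
)
```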

  • Question 127:

    A development team at an international gaming company is experimenting with Amazon DynamoDB to store in-game events for three mobile games. The most popular game hosts a maximum of 500,000 concurrent users, and the least popular game hosts a maximum of 10,000 concurrent users. The average size of an event is 20 KB, and the average user session produces one event each second. Each event is tagged with a time in milliseconds and a globally unique identifier.

    The lead developer created a single DynamoDB table for the events with the following schema:

    Partition key: game name
    Sort key: event identifier
    Local secondary index: player identifier
    Event time

    The tests were successful in a small-scale development environment. However, when deployed to production, new events stopped being added to the table and the logs show DynamoDB failures with the ItemCollectionSizeLimitExceededException error code.

    Which design change should a database specialist recommend to the development team?

    A. Use the player identifier as the partition key. Use the event time as the sort key. Add a global secondary index with the game name as the partition key and the event time as the sort key.

    B. Create two tables. Use the game name as the partition key in both tables. Use the event time as the sort key for the first table. Use the player identifier as the sort key for the second table.

    C. Replace the sort key with a compound value consisting of the player identifier collated with the event time, separated by a dash. Add a local secondary index with the player identifier as the sort key.

    D. Create one table for each game. Use the player identifier as the partition key. Use the event time as the sort key.
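
    For context on the error: ItemCollectionSizeLimitExceededException is raised when a table with a local secondary index accumulates more than 10 GB of items under a single partition key value, which a hot 'game name' key reaches quickly at this event rate. A minimal boto3 sketch of a per-game table keyed on player (names are hypothetical):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# One table per game: the player identifier spreads items across many
# partitions, and dropping the LSI removes the 10 GB item-collection cap.
dynamodb.create_table(
    TableName="game-alpha-events",
    AttributeDefinitions=[
        {"AttributeName": "player_id", "AttributeType": "S"},
        {"AttributeName": "event_time", "AttributeType": "N"},
    ],
    KeySchema=[
        {"AttributeName": "player_id", "KeyType": "HASH"},
        {"AttributeName": "event_time", "KeyType": "RANGE"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
```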

  • Question 128:

    A company's ecommerce website uses Amazon DynamoDB for purchase orders. Each order is made up of a Customer ID and an Order ID. The DynamoDB table uses the Customer ID as the partition key and the Order ID as the sort key.

    To meet a new requirement, the company also wants the ability to query the table by using a third attribute named Invoice ID. Queries using the Invoice ID must be strongly consistent. A database specialist must provide this capability with optimal performance and minimal overhead.

    What should the database specialist do to meet these requirements?

    A. Add a global secondary index on Invoice ID to the existing table.

    B. Add a local secondary index on Invoice ID to the existing table.

    C. Recreate the table by using the latest snapshot while adding a local secondary index on Invoice ID.

    D. Use the partition key and a FilterExpression parameter with a filter on Invoice ID for all queries.
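
    Two facts worth weighing here: a secondary index can be added to a live table only as a global secondary index, and strongly consistent reads (ConsistentRead=True) are supported only on local secondary indexes, which must be defined at table-creation time. A minimal boto3 sketch of the online GSI route (attribute and index names are illustrative; throughput settings assume on-demand billing):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Online index creation: no table rebuild, but queries against a GSI
# are eventually consistent; strong consistency requires an LSI.
dynamodb.update_table(
    TableName="purchase-orders",
    AttributeDefinitions=[{"AttributeName": "InvoiceId", "AttributeType": "S"}],
    GlobalSecondaryIndexUpdates=[
        {
            "Create": {
                "IndexName": "InvoiceId-index",
                "KeySchema": [{"AttributeName": "InvoiceId", "KeyType": "HASH"}],
                "Projection": {"ProjectionType": "ALL"},
            }
        }
    ],
)
```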

  • Question 129:

    A database expert is responsible for building a highly available online transaction processing (OLTP) solution that makes use of Amazon RDS for MySQL production databases. Disaster recovery criteria include a cross-regional deployment and an RPO and RTO of 5 and 30 minutes, respectively.

    What should the database professional do to ensure that the database meets the criteria for high availability and disaster recovery?

    A. Use a Multi-AZ deployment in each Region.

    B. Use read replica deployments in all Availability Zones of the secondary Region.

    C. Use Multi-AZ and read replica deployments within a Region.

    D. Use Multi-AZ and deploy a read replica in a secondary Region.
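
    As background for the cross-Region options: an RDS for MySQL cross-Region read replica is created from the destination Region by referencing the primary instance's ARN. A minimal boto3 sketch with placeholder identifiers (an unencrypted primary is assumed; encrypted sources also need a KMS key in the destination Region):

```python
import boto3

# Issue the call from the DR Region; the source is named by ARN.
rds_dr = boto3.client("rds", region_name="us-west-2")

rds_dr.create_db_instance_read_replica(
    DBInstanceIdentifier="oltp-mysql-dr",
    SourceDBInstanceIdentifier=(
        "arn:aws:rds:us-east-1:123456789012:db:oltp-mysql-primary"
    ),
)
```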

  • Question 130:

    A company is running its customer feedback application on Amazon Aurora MySQL. The company runs a report every day to extract customer feedback, and a team reads the feedback to determine if the customer comments are positive or negative. It sometimes takes days before the company can contact unhappy customers and take corrective measures. The company wants to use machine learning to automate this workflow.

    Which solution meets this requirement with the LEAST amount of effort?

    A. Export the Aurora MySQL database to Amazon S3 by using AWS Database Migration Service (AWS DMS). Use Amazon Comprehend to run sentiment analysis on the exported files.

    B. Export the Aurora MySQL database to Amazon S3 by using AWS Database Migration Service (AWS DMS). Use Amazon SageMaker to run sentiment analysis on the exported files.

    C. Set up Aurora native integration with Amazon Comprehend. Use SQL functions to extract sentiment analysis.

    D. Set up Aurora native integration with Amazon SageMaker. Use SQL functions to extract sentiment analysis.
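
    As background for the native-integration options: once the Aurora cluster's IAM role permits Comprehend access, sentiment analysis is callable directly from SQL via aws_comprehend_detect_sentiment. A minimal sketch using the PyMySQL driver (endpoint, credentials, and table name are placeholders):

```python
import pymysql

conn = pymysql.connect(
    host="feedback-cluster.cluster-example.us-east-1.rds.amazonaws.com",
    user="admin",
    password="example-password",
    database="feedback",
)

# Aurora MySQL forwards each comment to Amazon Comprehend and returns
# the sentiment (e.g., POSITIVE or NEGATIVE) inline, with no export step.
with conn.cursor() as cur:
    cur.execute(
        "SELECT comment, aws_comprehend_detect_sentiment(comment, 'en') "
        "FROM customer_feedback"
    )
    for comment, sentiment in cur.fetchall():
        print(sentiment, comment)
```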

Tips on How to Prepare for the Exams

Nowadays, certification exams have become increasingly important and are required by more and more enterprises when you apply for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your DBS-C01 exam preparation or Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.