Exam Details

  • Exam Code: BDS-C00
  • Exam Name: AWS Certified Big Data - Specialty (BDS-C00)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 264 Q&As
  • Last Updated: Apr 25, 2025

Amazon Certifications BDS-C00 Questions & Answers

  • Question 241:

A data engineer runs the data warehouse (DWH) for a SaaS service on a 25-node Amazon Redshift cluster. The data engineer needs to build a dashboard that will be used by customers. Five big customers represent 80% of usage, and there is a long tail of dozens of smaller customers. The data engineer has already selected the dashboarding tool.

    How should the data engineer make sure that the larger customer workloads do NOT interfere with the smaller customer workloads?

    A. Apply query filters based on customer-id that can NOT be changed by the user and apply distribution keys on customer-id.

    B. Place the largest customers into a single user group with a dedicated query queue and place the rest of the customers into a different query queue.

    C. Push aggregations into an RDS for Aurora instance. Connect the dashboard application to Aurora rather than Redshift for faster queries.

    D. Route the largest customers to a dedicated Redshift cluster. Raise the concurrency of the multi-tenant Redshift cluster to accommodate the remaining customers.
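
    Option B describes Amazon Redshift workload management (WLM) query queues. A minimal sketch of that setup in Python with boto3, assuming a hypothetical parameter group and user group name:

      import json
      import boto3

      redshift = boto3.client("redshift")

      # First queue serves the "big_customers" user group; the last queue,
      # with no user_group, is the default for all remaining customers.
      wlm_config = [
          {"user_group": ["big_customers"], "query_concurrency": 5},
          {"query_concurrency": 10},
      ]

      redshift.modify_cluster_parameter_group(
          ParameterGroupName="saas-dwh-params",  # hypothetical parameter group
          Parameters=[{
              "ParameterName": "wlm_json_configuration",
              "ParameterValue": json.dumps(wlm_config),
          }],
      )
      # The cluster must be rebooted before the new WLM configuration takes effect.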

  • Question 242:

A large oil and gas company needs to provide near real-time alerts when peak thresholds are exceeded in its pipeline system. The company has developed a system to capture pipeline metrics such as flow rate, pressure, and temperature using millions of sensors. The sensors deliver their data to AWS IoT.

    What is a cost-effective way to provide near real-time alerts on the pipeline metrics?

    A. Create an AWS IoT rule to generate an Amazon SNS notification.

B. Store the data points in an Amazon DynamoDB table and poll it for peak metrics data from an Amazon EC2 application.

    C. Create an Amazon Machine Learning model and invoke it with AWS Lambda.

    D. Use Amazon Kinesis Streams and a KCL-based application deployed on AWS Elastic Beanstalk.
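
    Option A describes an AWS IoT topic rule that fans out to Amazon SNS. A minimal sketch with boto3; the rule name, MQTT topic, threshold, and ARNs are hypothetical:

      import boto3

      iot = boto3.client("iot")

      iot.create_topic_rule(
          ruleName="pipeline_peak_alert",
          topicRulePayload={
              # Match only readings above a hypothetical peak pressure threshold.
              "sql": "SELECT * FROM 'pipeline/metrics' WHERE pressure > 300",
              "ruleDisabled": False,
              "actions": [{
                  "sns": {
                      "targetArn": "arn:aws:sns:us-east-1:123456789012:pipeline-alerts",
                      "roleArn": "arn:aws:iam::123456789012:role/iot-sns-publish",
                      "messageFormat": "JSON",
                  }
              }],
          },
      )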

  • Question 243:

A company is using Amazon Machine Learning as part of a medical software application. The application will predict the most likely blood type for a patient based on a variety of other clinical tests that are available when blood type knowledge is unavailable.

    What is the appropriate model choice and target attribute combination for this problem?

    A. Multi-class classification model with a categorical target attribute.

    B. Regression model with a numeric target attribute.

    C. Binary Classification with a categorical target attribute.

    D. K-Nearest Neighbors model with a multi-class target attribute.
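
    For context, Amazon Machine Learning (since deprecated for new accounts) set the model type at creation time. A multi-class model over a categorical target, as in option A, would have been created roughly like this; the IDs are hypothetical:

      import boto3

      ml = boto3.client("machinelearning")

      ml.create_ml_model(
          MLModelId="blood-type-model-001",
          MLModelName="blood-type-predictor",
          MLModelType="MULTICLASS",  # categorical target with more than two classes
          TrainingDataSourceId="clinical-tests-ds-001",
      )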

  • Question 244:

    A data engineer is about to perform a major upgrade to the DDL contained within an Amazon Redshift cluster to support a new data warehouse application. The upgrade scripts will include user permission updates, view and table structure changes as well as additional loading and data manipulation tasks.

    The data engineer must be able to restore the database to its existing state in the event of issues.

    Which action should be taken prior to performing this upgrade task?

    A. Run an UNLOAD command for all data in the warehouse and save it to S3.

    B. Create a manual snapshot of the Amazon Redshift cluster.

    C. Make a copy of the automated snapshot on the Amazon Redshift cluster.

    D. Call the waitForSnapshotAvailable command from either the AWS CLI or an AWS SDK.
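
    Options B and D both touch the Redshift snapshot API. A minimal sketch that takes a manual snapshot and waits for it to become available before the upgrade begins; the identifiers are hypothetical:

      import boto3

      redshift = boto3.client("redshift")

      redshift.create_cluster_snapshot(
          SnapshotIdentifier="pre-upgrade-snapshot",
          ClusterIdentifier="dwh-cluster",
      )

      # Block until the snapshot is fully available, then start the DDL upgrade.
      waiter = redshift.get_waiter("snapshot_available")
      waiter.wait(SnapshotIdentifier="pre-upgrade-snapshot")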

  • Question 245:

A game company needs to properly scale its game application, which is backed by DynamoDB. Amazon Redshift holds the past two years of historical data. Game traffic varies throughout the year based on various factors such as season, movie releases, and holidays. An administrator needs to calculate how much read and write throughput should be provisioned for the DynamoDB table for each week in advance.

    How should the administrator accomplish this task?

    A. Feed the data into Amazon Machine Learning and build a regression model.

B. Feed the data into Spark MLlib and build a random forest model.

    C. Feed the data into Apache Mahout and build a multi-classification model.

    D. Feed the data into Amazon Machine Learning and build a binary classification model.
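
    Whichever model produces the forecast, applying it comes down to setting the table's provisioned throughput ahead of each week. A minimal sketch with a hypothetical table name and predicted values:

      import boto3

      dynamodb = boto3.client("dynamodb")

      predicted_reads, predicted_writes = 1200, 400  # hypothetical model output

      dynamodb.update_table(
          TableName="game-state",
          ProvisionedThroughput={
              "ReadCapacityUnits": predicted_reads,
              "WriteCapacityUnits": predicted_writes,
          },
      )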

  • Question 246:

    An Amazon EMR cluster using EMRFS has access to petabytes of data on Amazon S3, originating from multiple unique data sources. The customer needs to query common fields across some of the data sets to be able to perform interactive joins and then display results quickly.

    Which technology is most appropriate to enable this capability?

    A. Presto

    B. MicroStrategy

    C. Pig

    D. R Studio
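
    For reference, Presto ships as a first-class Amazon EMR application, so enabling interactive SQL over EMRFS data is mostly a matter of including it at cluster launch. A sketch with hypothetical names and sizing:

      import boto3

      emr = boto3.client("emr")

      emr.run_job_flow(
          Name="interactive-query-cluster",
          ReleaseLabel="emr-5.0.0",
          Applications=[{"Name": "Presto"}, {"Name": "Hive"}],
          Instances={
              "MasterInstanceType": "m4.xlarge",
              "SlaveInstanceType": "m4.xlarge",
              "InstanceCount": 5,
              "KeepJobFlowAliveWhenNoSteps": True,
          },
          JobFlowRole="EMR_EC2_DefaultRole",
          ServiceRole="EMR_DefaultRole",
      )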

  • Question 247:

A social media customer has data from different data sources, including Amazon RDS running MySQL, Amazon Redshift, and Hive on Amazon EMR. To support better analysis, the customer needs to be able to analyze data from the different data sources and combine the results.

    What is the most cost-effective solution to meet these requirements?

A. Load all data from the different databases/warehouses to S3. Use the Redshift COPY command to copy the data to Redshift for analysis.

B. Install Presto on the EMR cluster where Hive sits. Configure the MySQL and PostgreSQL connectors to select from the different data sources in a single query.

    C. Spin up an Elasticsearch cluster. Load data from all three data sources and use Kibana to analyze.

    D. Write a program running on a separate EC2 instance to run queries to three different systems. Aggregate the results after getting the responses from all three systems.
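
    Option B's wiring works through Presto catalog properties files, one per data source, after which a single query can join across them. A sketch with hypothetical hosts and credentials; on EMR these files typically live under /etc/presto/conf/catalog/:

      import textwrap

      mysql_catalog = textwrap.dedent("""\
          connector.name=mysql
          connection-url=jdbc:mysql://rds-endpoint:3306
          connection-user=analytics
          connection-password=secret
          """)

      # Redshift is reached through Presto's PostgreSQL connector.
      redshift_catalog = textwrap.dedent("""\
          connector.name=postgresql
          connection-url=jdbc:postgresql://redshift-endpoint:5439/dwh
          connection-user=analytics
          connection-password=secret
          """)

      for path, body in [
          ("/etc/presto/conf/catalog/mysql.properties", mysql_catalog),
          ("/etc/presto/conf/catalog/redshift.properties", redshift_catalog),
      ]:
          with open(path, "w") as f:
              f.write(body)

      # Example federated query once Presto picks up the catalogs:
      #   SELECT u.id, o.total
      #   FROM mysql.app.users u
      #   JOIN redshift.public.orders o ON u.id = o.user_id;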

  • Question 248:

A data engineer wants to use Amazon Elastic MapReduce (Amazon EMR) for an application. The data engineer needs to make sure it complies with regulatory requirements. The auditor must be able to confirm at any point which servers are running and which network access controls are deployed.

    Which action should the data engineer take to meet this requirement?

    A. Provide the auditor IAM accounts with the SecurityAudit policy attached to their group.

    B. Provide the auditor with SSH keys for access to the Amazon EMR cluster.

    C. Provide the auditor with CloudFormation templates.

D. Provide the auditor with access to AWS Direct Connect to use their existing tools.
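
    Option A refers to the AWS-managed SecurityAudit policy, which grants read-only access to resource configurations such as running instances and security groups. A minimal sketch; the group name is hypothetical, while the policy ARN is the real AWS-managed one:

      import boto3

      iam = boto3.client("iam")

      iam.attach_group_policy(
          GroupName="auditors",
          PolicyArn="arn:aws:iam::aws:policy/SecurityAudit",
      )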

  • Question 249:

    An administrator needs to design the event log storage architecture for events from mobile devices. The event data will be processed by an Amazon EMR cluster daily for aggregated reporting and analytics before being archived.

    How should the administrator recommend storing the log data?

    A. Create an Amazon S3 bucket and write log data into folders by device. Execute the EMR job on the device folders.

B. Create an Amazon DynamoDB table partitioned on device and sorted on date, and write log data to the table. Execute the EMR job on the Amazon DynamoDB table.

    C. Create an Amazon S3 bucket and write data into folders by day. Execute the EMR job on the daily folder.

D. Create an Amazon DynamoDB table partitioned on EventID, and write log data to the table. Execute the EMR job on the table.
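
    For context on the Amazon S3 options: S3 "folders" are just key prefixes, so partitioning by day means writing each event under a date prefix that the daily EMR job can target. A sketch with a hypothetical bucket and event:

      import json
      from datetime import datetime, timezone

      import boto3

      s3 = boto3.client("s3")

      event = {"device_id": "d-42", "event": "level_complete"}
      day = datetime.now(timezone.utc).strftime("%Y/%m/%d")

      s3.put_object(
          Bucket="mobile-event-logs",
          Key=f"events/{day}/d-42-0001.json",  # e.g. events/2025/04/25/...
          Body=json.dumps(event).encode(),
      )
      # The daily EMR job then reads s3://mobile-event-logs/events/<year>/<month>/<day>/.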

  • Question 250:

A company operates an international business served from a single AWS region. The company wants to expand into a new country. The regulator for that country requires the Data Architect to maintain a log of financial transactions in the country within 24 hours of the product transaction. The production application is latency-insensitive. There is another AWS region in the new country.

    What is the most cost-effective way to meet this requirement?

    A. Use CloudFormation to replicate the production application to the new region.

    B. Use Amazon CloudFront to serve application content locally in the country; Amazon CloudFront logs will satisfy the requirement.

    C. Continue to serve customers from the existing region while using Amazon Kinesis to stream transaction data to the regulator.

    D. Use Amazon S3 cross-region replication to copy and persist production transaction logs to a bucket in the new country's region.
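
    Option D maps to S3 cross-region replication. A minimal sketch; both buckets must already have versioning enabled, and the bucket names, prefix, and role ARN are hypothetical:

      import boto3

      s3 = boto3.client("s3")

      s3.put_bucket_replication(
          Bucket="prod-transaction-logs",
          ReplicationConfiguration={
              "Role": "arn:aws:iam::123456789012:role/s3-crr-role",
              "Rules": [{
                  "ID": "replicate-transaction-logs",
                  "Prefix": "transactions/",
                  "Status": "Enabled",
                  "Destination": {
                      "Bucket": "arn:aws:s3:::regulator-logs-new-region",
                  },
              }],
          },
      )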

Tips on How to Prepare for the Exams

Nowadays, certification exams are becoming more and more important and are required by more and more enterprises when hiring. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you achieve an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your BDS-C00 exam preparation or Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.