Exam Details

  • Exam Code: BDS-C00
  • Exam Name: AWS Certified Big Data - Specialty (BDS-C00)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 264 Q&As
  • Last Updated: Apr 25, 2025

Amazon Certifications BDS-C00 Questions & Answers

  • Question 201:

    A solutions architect for a logistics organization ships packages from thousands of suppliers to end customers. The architect is building a platform where suppliers can view the status of one or more of their shipments. Each supplier can have multiple roles that will only allow access to specific fields in the resulting information.

    Which strategy allows the appropriate level of access control and requires the LEAST amount of management work?

    A. Send the tracking data to Amazon Kinesis Streams. Use AWS Lambda to store the data in an Amazon DynamoDB Table. Generate temporary AWS credentials for the suppliers' users with AWS STS, specifying fine-grained security policies to limit access only to their applicable data.

    B. Send the tracking data to Amazon Kinesis Firehose. Use Amazon S3 notifications and AWS Lambda to prepare files in Amazon S3 with appropriate data for each supplier's roles. Generate temporary AWS credentials for the suppliers' users with AWS STS. Limit access to the appropriate files through security policies.

    C. Send the tracking data to Amazon Kinesis Streams. Use Amazon EMR with Spark Streaming to store the data in HBase. Create one table per supplier. Use HBase Kerberos integration with the suppliers' users. Use HBase ACL-based security to limit access for the roles to their specific table and columns.

    D. Send the tracking data to Amazon Kinesis Firehose. Store the data in an Amazon Redshift cluster. Create views for the suppliers' users and roles. Allow suppliers access to the Amazon Redshift cluster using a user limited to the applicable view.
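
    For reference, the fine-grained pattern in option A can be sketched with boto3 as below. This is a minimal sketch, not part of the exam material: the table ARN, supplier key, and attribute names are hypothetical, and the dynamodb:LeadingKeys and dynamodb:Attributes IAM condition keys provide the row- and field-level restrictions.

        import json
        import boto3

        sts = boto3.client("sts")

        # Hypothetical identifiers -- adjust to the real account, table, and supplier.
        TABLE_ARN = "arn:aws:dynamodb:us-west-2:123456789012:table/Shipments"
        SUPPLIER_KEY = "SUPPLIER#1234"

        # Row-level: only items whose partition key belongs to this supplier.
        # Field-level: only the attributes this role is allowed to read.
        policy = {
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Action": ["dynamodb:GetItem", "dynamodb:Query"],
                "Resource": TABLE_ARN,
                "Condition": {
                    "ForAllValues:StringEquals": {
                        "dynamodb:LeadingKeys": [SUPPLIER_KEY],
                        "dynamodb:Attributes": ["shipment_id", "status", "eta"],
                    },
                    "StringEqualsIfExists": {"dynamodb:Select": "SPECIFIC_ATTRIBUTES"},
                },
            }],
        }

        # Temporary credentials scoped down by the inline policy.
        creds = sts.get_federation_token(
            Name="supplier-1234-viewer",
            Policy=json.dumps(policy),
            DurationSeconds=3600,
        )["Credentials"]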

  • Question 202:

A media advertising company handles a large number of real-time messages sourced from over 200 websites. The company's data engineer needs to collect and process records in real time for analysis using Spark Streaming on Amazon Elastic MapReduce (EMR). As a top priority, the data engineer must also satisfy a corporate mandate to keep ALL raw messages as they are received.

    Which Amazon Kinesis configuration meets these requirements?

    A. Publish messages to Amazon Kinesis Firehose backed by Amazon Simple Storage Service (S3). Pull messages off Firehose with Spark Streaming in parallel to persistence to Amazon S3.

    B. Publish messages to Amazon Kinesis Streams. Pull messages off Streams with Spark Streaming in parallel to AWS Lambda pushing messages from Streams to Firehose backed by Amazon Simple Storage Service (S3).

    C. Publish messages to Amazon Kinesis Firehose backed by Amazon Simple Storage Service (S3). Use AWS Lambda to pull messages from Firehose to Streams for processing with Spark Streaming.

    D. Publish messages to Amazon Kinesis Streams, pull messages off with Spark Streaming, and write raw data to Amazon Simple Storage Service (S3) before and after processing.
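
    A minimal sketch of the "persist raw first" consumer described in option D, assuming hypothetical stream and bucket names. A real deployment would use the Kinesis Client Library or Spark Streaming's Kinesis connector with checkpointing rather than this bare get_records loop.

        import time
        import boto3

        kinesis = boto3.client("kinesis")
        s3 = boto3.client("s3")

        STREAM, BUCKET = "web-messages", "raw-message-archive"  # hypothetical names

        shard_it = kinesis.get_shard_iterator(
            StreamName=STREAM,
            ShardId="shardId-000000000000",
            ShardIteratorType="TRIM_HORIZON",
        )["ShardIterator"]

        while shard_it:
            out = kinesis.get_records(ShardIterator=shard_it, Limit=100)
            for rec in out["Records"]:
                # Durably archive the raw payload *before* any processing,
                # so no message is lost if downstream analysis fails.
                s3.put_object(
                    Bucket=BUCKET,
                    Key=f"raw/{rec['SequenceNumber']}",
                    Body=rec["Data"],
                )
                # ... hand the record to Spark Streaming / analysis here ...
            shard_it = out.get("NextShardIterator")
            time.sleep(1)  # stay under per-shard read limits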

  • Question 203:

There are thousands of text files on Amazon S3. The total size of the files is 1 PB. The files contain retail order information for the past 2 years. A data engineer needs to run multiple interactive queries to manipulate the data. The data engineer has AWS access to spin up an Amazon EMR cluster, and needs to use an application on the cluster to process this data and return the results in an interactive time frame.

    Which application on the cluster should the data engineer use?

    A. Oozie

    B. Apache Pig with Tachyon

    C. Apache Hive

    D. Presto
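
    For context, Presto ships as a first-class EMR application, so an interactive SQL cluster can be requested directly at launch. A minimal sketch follows; the release label, instance types, and count are hypothetical placeholders (a 1 PB interactive workload would need far more capacity).

        import boto3

        emr = boto3.client("emr")

        cluster = emr.run_job_flow(
            Name="interactive-orders-analysis",
            ReleaseLabel="emr-5.36.0",
            # Hive provides the metastore table definitions Presto queries against.
            Applications=[{"Name": "Presto"}, {"Name": "Hive"}],
            Instances={
                "MasterInstanceType": "r5.4xlarge",
                "SlaveInstanceType": "r5.4xlarge",
                "InstanceCount": 20,
                "KeepJobFlowAliveWhenNoSteps": True,  # keep alive for ad hoc queries
            },
            JobFlowRole="EMR_EC2_DefaultRole",
            ServiceRole="EMR_DefaultRole",
        )
        print(cluster["JobFlowId"])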

  • Question 204:

    A company hosts a portfolio of e-commerce websites across the Oregon, N. Virginia, Ireland, and Sydney AWS regions. Each site keeps log files that capture user behavior. The company has built an application that generates batches of product recommendations with collaborative filtering in Oregon. Oregon was selected because the flagship site is hosted there and provides the largest collection of data to train machine learning models against. The other regions do NOT have enough historic data to train accurate machine learning models.

    Which set of data processing steps improves recommendations for each region?

    A. Use the e-commerce application in Oregon to write replica log files in each other region.

    B. Use Amazon S3 bucket replication to consolidate log entries and build a single model in Oregon.

    C. Use Kinesis as a buffer for web logs and replicate logs to the Kinesis stream of a neighboring region.

    D. Use the CloudWatch Logs agent to consolidate logs into a single CloudWatch Logs group.
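
    The consolidation step in option B can be sketched with boto3 as below. The bucket names and IAM role ARN are hypothetical, and S3 replication additionally requires versioning to be enabled on both the source and destination buckets.

        import boto3

        s3 = boto3.client("s3", region_name="eu-west-1")  # e.g., the Ireland site

        # Replicate Ireland's log objects into the central Oregon bucket so the
        # collaborative-filtering job can train one model on all regions' data.
        s3.put_bucket_replication(
            Bucket="ecommerce-logs-eu-west-1",  # hypothetical source bucket
            ReplicationConfiguration={
                "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
                "Rules": [{
                    "ID": "logs-to-oregon",
                    "Prefix": "weblogs/",
                    "Status": "Enabled",
                    "Destination": {
                        "Bucket": "arn:aws:s3:::ecommerce-logs-central-oregon"
                    },
                }],
            },
        )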

  • Question 205:

A company that provides economics data dashboards needs to develop software that displays rich, interactive, data-driven graphics that run in web browsers and leverage the full stack of web standards (HTML, SVG, and CSS).

Which technology provides the most appropriate support for these requirements?

    A. D3.js

    B. IPython/Jupyter

    C. R Studio

    D. Hue

  • Question 206:

A solutions architect works for a company that has a data lake based on a central Amazon S3 bucket. The data contains sensitive information. The architect must be able to specify exactly which files each user can access. Users access the platform through a SAML-federated single sign-on (SSO) platform.

The architect needs to build a solution that allows fine-grained access control, traceability of access to the objects, and usage of the standard tools (AWS Console, AWS CLI) to access the data.

    Which solution should the architect build?

    A. Use Amazon S3 Server-Side Encryption with AWS KMS-Managed Keys for storing data. Use AWS KMS Grants to allow access to specific elements of the platform. Use AWS CloudTrail for auditing.

    B. Use Amazon S3 Server-Side Encryption with Amazon S3-Managed Keys. Set Amazon S3 ACLs to allow access to specific elements of the platform. Use Amazon S3 to access logs for auditing.

    C. Use Amazon S3 Client-Side Encryption with Client-Side Master Key. Set Amazon S3 ACLs to allow access to specific elements of the platform. Use Amazon S3 to access logs for auditing.

    D. Use Amazon S3 Client-Side Encryption with AWS KMS-Managed Keys for storing data. Use AWS KMS Grants to allow access to specific elements of the platform. Use AWS CloudTrail for auditing.
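
    A minimal sketch of the building blocks in option A, using a hypothetical CMK, bucket, and role ARN: objects are written with SSE-KMS, a KMS grant lets a specific principal decrypt them, and each use of the key is recorded by AWS CloudTrail for auditing.

        import boto3

        s3 = boto3.client("s3")
        kms = boto3.client("kms")

        KEY_ID = "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab"

        # Store the object encrypted under the customer master key.
        s3.put_object(
            Bucket="data-lake-central",          # hypothetical bucket
            Key="finance/q3-report.csv",
            Body=b"...sensitive rows...",
            ServerSideEncryption="aws:kms",
            SSEKMSKeyId=KEY_ID,
        )

        # Grant one federated role permission to decrypt (and hence read) the data.
        # Every resulting KMS Decrypt call is logged by CloudTrail.
        kms.create_grant(
            KeyId=KEY_ID,
            GranteePrincipal="arn:aws:iam::123456789012:role/FinanceAnalysts",
            Operations=["Decrypt"],
        )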

  • Question 207:

    A company with a support organization needs support engineers to be able to search historic cases to provide fast responses on new issues raised. The company has forwarded all support messages into an Amazon Kinesis Stream. This meets a company objective of using only managed services to reduce operational overhead.

    The company needs an appropriate architecture that allows support engineers to search on historic cases and find similar issues and their associated responses.

    Which AWS Lambda action is most appropriate?

    A. Ingest and index the content into an Amazon Elasticsearch domain.

    B. Stem and tokenize the input and store the results into Amazon ElastiCache.

    C. Write data as JSON into Amazon DynamoDB with primary and secondary indexes.

    D. Aggregate feedback in Amazon S3 using a columnar format with partitioning.
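
    The Lambda function behind option A might look roughly like the sketch below. The Elasticsearch endpoint is hypothetical, and the domain's access policy is assumed to allow the Lambda execution role; production code would sign each request with SigV4 (for example via the requests-aws4auth package) instead of sending it unsigned.

        import base64
        import json
        import urllib.request

        # Hypothetical Amazon Elasticsearch Service endpoint and index.
        ES_URL = "https://search-support-cases-abc123.us-east-1.es.amazonaws.com/cases/_doc"

        def handler(event, context):
            """Triggered by the Kinesis stream; indexes each support message."""
            for record in event["Records"]:
                # Kinesis record payloads arrive base64-encoded.
                message = json.loads(base64.b64decode(record["kinesis"]["data"]))
                req = urllib.request.Request(
                    ES_URL,
                    data=json.dumps(message).encode("utf-8"),
                    headers={"Content-Type": "application/json"},
                    method="POST",
                )
                urllib.request.urlopen(req)  # searchable moments later
            return {"indexed": len(event["Records"])}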

  • Question 208:

    A company needs a churn prevention model to predict which customers will NOT renew their yearly subscription to the company's service. The company plans to provide these customers with a promotional offer. A binary classification model that uses Amazon Machine Learning is required.

    On which basis should this binary classification model be built?

    A. User profiles (age, gender, income, occupation)

    B. Last user session

C. Each user's time-series events from the past 3 months

    D. Quarterly results
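
    For reference, the (now-legacy) Amazon Machine Learning API exposes binary classification directly. The sketch below assumes a CSV with one row per user, features derived from the last three months of events, and a 0/1 target column; all IDs, paths, and schema fields are hypothetical.

        import boto3

        ml = boto3.client("machinelearning")  # legacy service, retired for new accounts

        # Hypothetical training datasource built from per-user event aggregates.
        ml.create_data_source_from_s3(
            DataSourceId="ds-churn-training",
            DataSpec={
                "DataLocationS3": "s3://churn-data/training/users.csv",
                "DataSchema": (
                    '{"version":"1.0","targetAttributeName":"churned",'
                    '"dataFormat":"CSV","dataFileContainsHeader":true,'
                    '"attributes":['
                    '{"attributeName":"sessions_90d","attributeType":"NUMERIC"},'
                    '{"attributeName":"churned","attributeType":"BINARY"}]}'
                ),
            },
            ComputeStatistics=True,
        )

        # MLModelType="BINARY" yields the required churn / no-churn classifier.
        ml.create_ml_model(
            MLModelId="ml-churn-v1",
            MLModelType="BINARY",
            TrainingDataSourceId="ds-churn-training",
        )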

  • Question 209:

A company generates a large number of files each month and needs to use AWS Import/Export to move these files into Amazon S3 storage. To satisfy the auditors, the company needs to keep a record of which files were imported into Amazon S3.

    What is a low-cost way to create a unique log for each import job?

    A. Use the same log file prefix in the import/export manifest files to create a versioned log file in Amazon S3 for all imports.

    B. Use the log file prefix in the import/export manifest files to create a unique log file in Amazon S3 for each import.

    C. Use the log file checksum in the import/export manifest files to create a unique log file in Amazon S3 for each import.

    D. Use a script to iterate over files in Amazon S3 to generate a log after each import/export job.
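
    Option B hinges on the logBucket/logPrefix fields of the Import/Export manifest; a small generator is sketched below. The field names follow the classic AWS Import/Export Disk manifest format as best recalled here, and the bucket names and date-stamped prefix scheme are hypothetical.

        from datetime import date

        def build_manifest(job_name: str, device_id: str) -> str:
            """Render a minimal manifest with a unique log prefix per import job."""
            log_prefix = f"import-logs/{date.today():%Y-%m-%d}-{job_name}/"
            return "\n".join([
                "manifestVersion: 2.0",
                f"deviceId: {device_id}",
                "bucket: monthly-file-drop",           # hypothetical data bucket
                "logBucket: monthly-file-drop-audit",  # hypothetical audit bucket
                f"logPrefix: {log_prefix}",            # one distinct log per import
                "eraseDevice: false",
            ])

        print(build_manifest("2025-04-batch", "ABCDE"))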

  • Question 210:

An organization is designing an application architecture. The application will have over 100 TB of data and will support transactions that arrive at rates from hundreds per second to tens of thousands per second, depending on the day of the week and time of day. All transaction data must be durably and reliably stored. Certain read operations must be performed with strong consistency.

    Which solution meets these requirements?

    A. Use Amazon DynamoDB as the data store and use strongly consistent reads when necessary.

    B. Use an Amazon Relational Database Service (RDS) instance sized to meet the maximum anticipated transaction rate and with the High Availability option enabled.

    C. Deploy a NoSQL data store on top of an Amazon Elastic MapReduce (EMR) cluster, and select the HDFS High Durability option.

    D. Use Amazon Redshift with synchronous replication to Amazon Simple Storage Service (S3) and row-level locking for strong consistency.
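
    The deciding feature in option A is per-request consistency control. A minimal sketch, with a hypothetical table and key schema:

        import boto3

        table = boto3.resource("dynamodb").Table("Transactions")  # hypothetical table

        # Writes are durably persisted across multiple facilities before returning.
        table.put_item(Item={"txn_id": "t-1001", "amount": "129.99", "status": "CAPTURED"})

        # Reads default to eventual consistency; opt in to strong consistency only
        # where needed (strongly consistent reads consume twice the read capacity).
        resp = table.get_item(Key={"txn_id": "t-1001"}, ConsistentRead=True)
        print(resp["Item"]["status"])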

Tips on How to Prepare for the Exams

Nowadays, certification exams have become more and more important, and more and more enterprises require them when you apply for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you achieve an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations, but also complete assistance with your exam preparation and certification application. If you are unsure about your BDS-C00 exam preparation or your Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.