Exam Details

  • Exam Code: DAS-C01
  • Exam Name: AWS Certified Data Analytics - Specialty (DAS-C01)
  • Certification: Amazon Certifications
  • Vendor: Amazon
  • Total Questions: 285 Q&As
  • Last Updated: Apr 27, 2025

Amazon Certifications DAS-C01 Questions & Answers

  • Question 141:

    A company receives data from its vendor in JSON format with a timestamp in the file name. The vendor uploads the data to an Amazon S3 bucket, and the data is registered into the company's data lake for analysis and reporting. The company has configured an S3 Lifecycle policy to archive all files to S3 Glacier after 5 days.

    The company wants to ensure that its AWS Glue crawler catalogs data only from S3 Standard storage and ignores the archived files. A data analytics specialist must implement a solution to achieve this goal without changing the current S3 bucket configuration.

    Which solution meets these requirements?

    A. Use the exclude patterns feature of AWS Glue to identify the S3 Glacier files for the crawler to exclude.

    B. Schedule an automation job that uses AWS Lambda to move files from the original S3 bucket to a new S3 bucket for S3 Glacier storage.

    C. Use the excludeStorageClasses property in the AWS Glue Data Catalog table to exclude files on S3 Glacier storage.

    D. Use the include patterns feature of AWS Glue to identify the S3 Standard files for the crawler to include.
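
    For context on the mechanism behind option A, here is a minimal boto3 sketch of attaching exclude patterns to a crawler's S3 target. All names are placeholders; note that exclusions are glob patterns matched against object keys under the target path, not against storage classes.

      import boto3

      glue = boto3.client("glue")

      # All names are placeholders. Exclusions are glob patterns matched
      # against object keys beneath the target path.
      glue.create_crawler(
          Name="vendor-json-crawler",
          Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
          DatabaseName="vendor_data",
          Targets={
              "S3Targets": [
                  {
                      "Path": "s3://vendor-data-bucket/json/",
                      "Exclusions": ["**/archive/**"],
                  }
              ]
          },
      )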

  • Question 142:

    A company analyzes historical data and needs to query data that is stored in Amazon S3. New data is generated daily as .csv files that are stored in Amazon S3. The company's analysts are using Amazon Athena to perform SQL queries against a recent subset of the overall data.

    The amount of data that is ingested into Amazon S3 has increased substantially over time, and the query latency also has increased.

    Which solutions could the company implement to improve query performance? (Choose two.)

    A. Use MySQL Workbench on an Amazon EC2 instance, and connect to Athena by using a JDBC or ODBC connector. Run the query from MySQL Workbench instead of Athena directly.

    B. Use Athena to extract the data and store it in Apache Parquet format on a daily basis. Query the extracted data.

    C. Run a daily AWS Glue ETL job to convert the data files to Apache Parquet and to partition the converted files. Create a periodic AWS Glue crawler to automatically crawl the partitioned data on a daily basis.

    D. Run a daily AWS Glue ETL job to compress the data files by using the .gzip format. Query the compressed data.

    E. Run a daily AWS Glue ETL job to compress the data files by using the .lzo format. Query the compressed data.
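
    As an illustration of the conversion described in option C, a minimal AWS Glue (PySpark) job sketch that rewrites .csv input as partitioned Parquet. Paths are placeholders, and it assumes year/month/day already exist as columns to partition on.

      import sys

      from awsglue.context import GlueContext
      from awsglue.utils import getResolvedOptions
      from pyspark.context import SparkContext

      args = getResolvedOptions(sys.argv, ["JOB_NAME"])
      glue_context = GlueContext(SparkContext.getOrCreate())

      # Read the raw daily .csv files (paths are placeholders).
      frame = glue_context.create_dynamic_frame.from_options(
          connection_type="s3",
          connection_options={"paths": ["s3://analytics-raw/csv/"]},
          format="csv",
          format_options={"withHeader": True},
      )

      # Write Parquet partitioned by date columns so Athena can prune
      # partitions; assumes year/month/day exist as columns in the data.
      glue_context.write_dynamic_frame.from_options(
          frame=frame,
          connection_type="s3",
          connection_options={
              "path": "s3://analytics-curated/parquet/",
              "partitionKeys": ["year", "month", "day"],
          },
          format="parquet",
      )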

  • Question 143:

    A market data company aggregates external data sources to create a detailed view of product consumption in different countries. The company wants to sell this data to external parties through a subscription. To achieve this goal, the company needs to make its data securely available to external parties who are also AWS users.

    What should the company do to meet these requirements with the LEAST operational overhead?

    A. Store the data in Amazon S3. Share the data by using presigned URLs for security.

    B. Store the data in Amazon S3. Share the data by using S3 bucket ACLs.

    C. Upload the data to AWS Data Exchange for storage. Share the data by using presigned URLs for security.

    D. Upload the data to AWS Data Exchange for storage. Share the data by using the AWS Data Exchange sharing wizard.
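
    For reference, the presigned-URL mechanism that options A and C rely on is a short boto3 call; the bucket and key below are placeholders. Each URL grants time-limited access to a single object and must be regenerated per object and per subscriber, which is where the operational overhead comes from.

      import boto3

      s3 = boto3.client("s3")

      # Placeholder bucket and key. The URL expires after ExpiresIn seconds
      # and must be regenerated per object and per subscriber.
      url = s3.generate_presigned_url(
          "get_object",
          Params={"Bucket": "market-data-bucket", "Key": "consumption/eu.parquet"},
          ExpiresIn=3600,
      )
      print(url)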

  • Question 144:

    A company has a marketing department and a finance department. The departments are storing data in Amazon S3 in their own AWS accounts in AWS Organizations. Both departments use AWS Lake Formation to catalog and secure their data. The departments have some databases and tables that share common names.

    The marketing department needs to securely access some tables from the finance department.

    Which two steps are required for this process? (Choose two.)

    A. The finance department grants Lake Formation permissions for the tables to the external account for the marketing department.

    B. The finance department creates cross-account IAM permissions to the table for the marketing department role.

    C. The marketing department creates an IAM role that has permissions to the Lake Formation tables.
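
    A boto3 sketch of the cross-account grant described in option A, run from the finance account; the account IDs, database, and table names are placeholders.

      import boto3

      lakeformation = boto3.client("lakeformation")

      # Run from the finance account. Account IDs, database, and table
      # names are placeholders.
      lakeformation.grant_permissions(
          Principal={"DataLakePrincipalIdentifier": "222233334444"},  # marketing account
          Resource={
              "Table": {
                  "CatalogId": "111122223333",  # finance account owns the catalog
                  "DatabaseName": "finance_db",
                  "Name": "invoices",
              }
          },
          Permissions=["SELECT"],
      )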

  • Question 145:

    A company has an encrypted Amazon Redshift cluster. The company recently enabled Amazon Redshift audit logs and needs to ensure that the audit logs are also encrypted at rest. The logs are retained for 1 year. The auditor queries the logs once a month.

    What is the MOST cost-effective way to meet these requirements?

    A. Encrypt the Amazon S3 bucket where the logs are stored by using AWS Key Management Service (AWS KMS). Copy the data into the Amazon Redshift cluster from Amazon S3 on a daily basis. Query the data as required.

    B. Disable encryption on the Amazon Redshift cluster, configure audit logging, and encrypt the Amazon Redshift cluster. Use Amazon Redshift Spectrum to query the data as required.

    C. Enable default encryption on the Amazon S3 bucket where the logs are stored by using AES-256 encryption. Copy the data into the Amazon Redshift cluster from Amazon S3 on a daily basis. Query the data as required.

    D. Enable default encryption on the Amazon S3 bucket where the logs are stored by using AES-256 encryption. Use Amazon Redshift Spectrum to query the data as required.
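
    For options C and D, enabling default AES-256 (SSE-S3) encryption on the log bucket is a single API call; the bucket name below is a placeholder.

      import boto3

      s3 = boto3.client("s3")

      # Placeholder bucket name. AES256 here is S3-managed encryption
      # (SSE-S3), which adds no per-request key-management cost.
      s3.put_bucket_encryption(
          Bucket="redshift-audit-logs",
          ServerSideEncryptionConfiguration={
              "Rules": [
                  {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
              ]
          },
      )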

  • Question 146:

    A data analytics specialist is setting up workload management in manual mode for an Amazon Redshift environment. The data analytics specialist is defining query monitoring rules to manage system performance and user experience of an Amazon Redshift cluster.

    Which elements must each query monitoring rule include?

    A. A unique rule name, a query runtime condition, and an AWS Lambda function to resubmit any failed queries in off hours

    B. A queue name, a unique rule name, and a predicate-based stop condition

    C. A unique rule name, one to three predicates, and an action

    D. A workload name, a unique rule name, and a query runtime-based condition
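
    For reference, a sketch of a manual WLM configuration showing the shape of a query monitoring rule: a unique rule name, predicates built from a metric, an operator, and a value, and an action. All names and thresholds are illustrative.

      import json

      import boto3

      redshift = boto3.client("redshift")

      # Illustrative manual WLM queue: the rule carries a unique name,
      # predicates (metric, operator, value), and an action (log/hop/abort).
      wlm = [
          {
              "user_group": ["analysts"],
              "query_concurrency": 5,
              "rules": [
                  {
                      "rule_name": "abort_long_queries",
                      "predicate": [
                          {"metric_name": "query_execution_time", "operator": ">", "value": 300}
                      ],
                      "action": "abort",
                  }
              ],
          }
      ]

      redshift.modify_cluster_parameter_group(
          ParameterGroupName="custom-wlm",
          Parameters=[
              {"ParameterName": "wlm_json_configuration", "ParameterValue": json.dumps(wlm)}
          ],
      )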

  • Question 147:

    A company has a data lake on AWS that ingests sources of data from multiple business units and uses Amazon Athena for queries. The storage layer is Amazon S3 using the AWS Glue Data Catalog. The company wants to make the data available to its data scientists and business analysts. However, the company first needs to manage data access for Athena based on user roles and responsibilities.

    What should the company do to apply these access controls with the LEAST operational overhead?

    A. Define security policy-based rules for the users and applications by role in AWS Lake Formation.

    B. Define security policy-based rules for the users and applications by role in AWS Identity and Access Management (IAM).

    C. Define security policy-based rules for the tables and columns by role in AWS Glue.

    D. Define security policy-based rules for the tables and columns by role in AWS Identity and Access Management (IAM).
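
    As an illustration of role-based, column-level control in Lake Formation, a boto3 sketch granting a hypothetical analyst role SELECT on selected columns only; all role, database, table, and column names are placeholders.

      import boto3

      lakeformation = boto3.client("lakeformation")

      # Hypothetical role and table names; grants SELECT on only the
      # listed columns to the analyst role.
      lakeformation.grant_permissions(
          Principal={
              "DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/BusinessAnalyst"
          },
          Resource={
              "TableWithColumns": {
                  "DatabaseName": "datalake_db",
                  "Name": "sales",
                  "ColumnNames": ["order_id", "region", "amount"],
              }
          },
          Permissions=["SELECT"],
      )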

  • Question 148:

    A company owns facilities with IoT devices installed across the world. The company is using Amazon Kinesis Data Streams to stream data from the devices to Amazon S3. The company's operations team wants to get insights from the IoT data to monitor data quality at ingestion. The insights need to be derived in near-real time, and the output must be logged to Amazon DynamoDB for further analysis.

    Which solution meets these requirements?

    A. Connect Amazon Kinesis Data Analytics to analyze the stream data. Save the output to DynamoDB by using the default output from Kinesis Data Analytics.

    B. Connect Amazon Kinesis Data Analytics to analyze the stream data. Save the output to DynamoDB by using an AWS Lambda function.

    C. Connect Amazon Kinesis Data Firehose to analyze the stream data by using an AWS Lambda function. Save the output to DynamoDB by using the default output from Kinesis Data Firehose.

    D. Connect Amazon Kinesis Data Firehose to analyze the stream data by using an AWS Lambda function. Save the data to Amazon S3. Then run an AWS Glue job on schedule to ingest the data into DynamoDB.
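
    For option B, a minimal sketch of a Lambda function used as a Kinesis Data Analytics output destination; the DynamoDB table name is a placeholder, and the decoded items are assumed to match the table's key schema.

      import base64
      import json

      import boto3

      dynamodb = boto3.resource("dynamodb")
      table = dynamodb.Table("iot-quality-metrics")  # placeholder table name

      def handler(event, context):
          # Kinesis Data Analytics delivers output rows in event["records"];
          # each payload arrives base64-encoded.
          results = []
          for record in event["records"]:
              item = json.loads(base64.b64decode(record["data"]))
              table.put_item(Item=item)  # item keys must match the table's key schema
              results.append({"recordId": record["recordId"], "result": "Ok"})
          # The service expects a per-record Ok/DeliveryFailed acknowledgment.
          return {"records": results}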

  • Question 149:

    A company uses Amazon Redshift for its data warehousing needs. ETL jobs run every night to load data, apply business rules, and create aggregate tables for reporting. The company's data analysis, data science, and business intelligence teams use the data warehouse during regular business hours. The workload management is set to auto, and separate queues exist for each team with the priority set to NORMAL.

    Recently, a sudden spike of read queries from the data analysis team has occurred at least twice daily, and queries wait in line for cluster resources. The company needs a solution that enables the data analysis team to avoid query queuing without impacting latency and the query times of other teams.

    Which solution meets these requirements?

    A. Increase the query priority to HIGHEST for the data analysis queue.

    B. Configure the data analysis queue to enable concurrency scaling.

    C. Create a query monitoring rule to add more cluster capacity for the data analysis queue when queries are waiting for resources.

    D. Use workload management query queue hopping to route the query to the next matching queue.
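
    For option B, a sketch of enabling concurrency scaling on only the data analysis queue under auto WLM; the queue layout, group names, and parameter group name are illustrative.

      import json

      import boto3

      redshift = boto3.client("redshift")

      # Illustrative auto WLM layout: concurrency scaling is enabled only
      # for the data analysis queue, so its bursts run on transient capacity.
      wlm = [
          {"user_group": ["data_analysis"], "priority": "normal", "concurrency_scaling": "auto"},
          {"user_group": ["data_science"], "priority": "normal"},
          {"user_group": ["business_intelligence"], "priority": "normal"},
          {"auto_wlm": True},
      ]

      redshift.modify_cluster_parameter_group(
          ParameterGroupName="custom-wlm",
          Parameters=[
              {"ParameterName": "wlm_json_configuration", "ParameterValue": json.dumps(wlm)}
          ],
      )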

  • Question 150:

    A marketing company has data in Salesforce, MySQL, and Amazon S3. The company wants to use data from these three locations and create mobile dashboards for its users. The company is unsure how it should create the dashboards and needs a solution with the least possible customization and coding.

    Which solution meets these requirements?

    A. Use Amazon Athena federated queries to join the data sources. Use Amazon QuickSight to generate the mobile dashboards.

    B. Use AWS Lake Formation to migrate the data sources into Amazon S3. Use Amazon QuickSight to generate the mobile dashboards.

    C. Use Amazon Redshift federated queries to join the data sources. Use Amazon QuickSight to generate the mobile dashboards.

    D. Use Amazon QuickSight to connect to the data sources and generate the mobile dashboards.
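
    For reference, registering one of the three sources (here MySQL) as a QuickSight data source is a single boto3 call; the account ID, host, and credentials are placeholders, and Salesforce and S3/Athena sources connect through their own parameter blocks.

      import boto3

      quicksight = boto3.client("quicksight")

      # Placeholder account ID, host, and credentials. Salesforce and
      # S3/Athena sources connect through their own parameter blocks.
      quicksight.create_data_source(
          AwsAccountId="111122223333",
          DataSourceId="marketing-mysql",
          Name="Marketing MySQL",
          Type="MYSQL",
          DataSourceParameters={
              "MySqlParameters": {
                  "Host": "mysql.example.com",
                  "Port": 3306,
                  "Database": "marketing",
              }
          },
          Credentials={
              "CredentialPair": {"Username": "quicksight_ro", "Password": "example-only"}
          },
      )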

Tips on How to Prepare for the Exams

Nowadays, certification exams are becoming more important and are required by more and more enterprises when hiring. But how do you prepare for the exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and how do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Amazon exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your DAS-C01 exam preparation or your Amazon certification application, do not hesitate to visit Vcedump.com to find your solutions.