A financial services company needs to aggregate daily stock trade data from the exchanges into a data store. The company requires that data be streamed directly into the data store, but it also occasionally allows data to be modified using SQL. The solution must support complex analytic queries that run with minimal latency, and it must provide a business intelligence dashboard that enables viewing of the top contributors to anomalies in stock prices.
Which solution meets the company's requirements?
A. Use Amazon Kinesis Data Firehose to stream data to Amazon S3. Use Amazon Athena as a data source for Amazon QuickSight to create a business intelligence dashboard.
B. Use Amazon Kinesis Data Streams to stream data to Amazon Redshift. Use Amazon Redshift as a data source for Amazon QuickSight to create a business intelligence dashboard.
C. Use Amazon Kinesis Data Firehose to stream data to Amazon Redshift. Use Amazon Redshift as a data source for Amazon QuickSight to create a business intelligence dashboard.
D. Use Amazon Kinesis Data Streams to stream data to Amazon S3. Use Amazon Athena as a data source for Amazon QuickSight to create a business intelligence dashboard.
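For context on the Firehose-to-Redshift pattern named in option C: Kinesis Data Firehose stages records in S3 and issues a COPY into a Redshift table, which leaves the data queryable and modifiable with SQL. A minimal boto3 sketch, with hypothetical names, ARNs, and credentials:

# Hedged sketch: a Firehose delivery stream loading into Amazon Redshift.
# All identifiers below are hypothetical placeholders.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="trade-data-stream",        # hypothetical name
    DeliveryStreamType="DirectPut",
    RedshiftDestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-redshift-role",
        "ClusterJDBCURL": "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/trades",
        "CopyCommand": {
            "DataTableName": "daily_trades",
            "CopyOptions": "json 'auto'",
        },
        "Username": "firehose_user",
        "Password": "REPLACE_ME",
        # Firehose first stages records in S3, then runs COPY into Redshift
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-redshift-role",
            "BucketARN": "arn:aws:s3:::example-trade-staging",
        },
    },
)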
A financial company hosts a data lake in Amazon S3 and a data warehouse on an Amazon Redshift cluster. The company uses Amazon QuickSight to build dashboards and wants to secure access from its on-premises Active Directory to Amazon QuickSight.
How should the data be secured?
A. Use an Active Directory connector and single sign-on (SSO) in a corporate network environment.
B. Use a VPC endpoint to connect to Amazon S3 from Amazon QuickSight and an IAM role to authenticate Amazon Redshift.
C. Establish a secure connection by creating an S3 endpoint to connect Amazon QuickSight and a VPC endpoint to connect to Amazon Redshift.
D. Place Amazon QuickSight and Amazon Redshift in the security group and use an Amazon S3 endpoint to connect Amazon QuickSight to Amazon S3.
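For context on option A: an AD Connector proxies authentication requests to an on-premises Active Directory without caching credentials in AWS, and QuickSight can then use that directory with single sign-on. A minimal sketch using the boto3 Directory Service client, with hypothetical identifiers:

# Hedged sketch: creating an AD Connector to an on-premises directory.
# Domain, VPC, subnet, DNS, and account values are hypothetical.
import boto3

ds = boto3.client("ds", region_name="us-east-1")

ds.connect_directory(
    Name="corp.example.com",            # on-premises AD domain (hypothetical)
    ShortName="CORP",
    Password="REPLACE_ME",              # service account password
    Size="Small",
    ConnectSettings={
        "VpcId": "vpc-0abc1234",
        "SubnetIds": ["subnet-0abc1234", "subnet-0def5678"],
        "CustomerDnsIps": ["10.0.0.10", "10.0.0.11"],   # on-premises DNS
        "CustomerUserName": "ad-connector-svc",
    },
)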
A software company hosts an application on AWS, and new features are released weekly. As part of the application testing process, a solution must be developed that analyzes logs from each Amazon EC2 instance to ensure that the application is working as expected after each deployment. The collection and analysis solution should be highly available with the ability to display new information with minimal delays.
Which method should the company use to collect and analyze the logs?
A. Enable detailed monitoring on Amazon EC2, use Amazon CloudWatch agent to store logs in Amazon S3, and use Amazon Athena for fast, interactive log analytics.
B. Use the Amazon Kinesis Producer Library (KPL) agent on Amazon EC2 to collect and send data to Kinesis Data Streams to further push the data to Amazon OpenSearch Service (Amazon Elasticsearch Service) and visualize using Amazon QuickSight.
C. Use the Amazon Kinesis Producer Library (KPL) agent on Amazon EC2 to collect and send data to Kinesis Data Firehose to further push the data to Amazon OpenSearch Service (Amazon Elasticsearch Service) and OpenSearch Dashboards (Kibana).
D. Use Amazon CloudWatch subscriptions to get access to a real-time feed of logs and have the logs delivered to Amazon Kinesis Data Streams to further push the data to Amazon OpenSearch Service (Amazon Elasticsearch Service) and OpenSearch Dashboards (Kibana).
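For context on the subscription pattern in option D: a CloudWatch Logs subscription filter delivers a real-time feed of a log group's events to a Kinesis data stream, from which they can be pushed to OpenSearch Service. A minimal boto3 sketch, with hypothetical names and ARNs:

# Hedged sketch: subscribing a log group to a Kinesis data stream.
# The log group, stream, and role identifiers are hypothetical.
import boto3

logs = boto3.client("logs", region_name="us-east-1")

logs.put_subscription_filter(
    logGroupName="/app/ec2/deploy-tests",         # hypothetical log group
    filterName="to-kinesis",
    filterPattern="",                             # empty pattern = all events
    destinationArn="arn:aws:kinesis:us-east-1:123456789012:stream/app-logs",
    roleArn="arn:aws:iam::123456789012:role/cwl-to-kinesis-role",
)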
A real estate company has a mission-critical application using Apache HBase in Amazon EMR. Amazon EMR is configured with a single master node. The company has over 5 TB of data stored in a Hadoop Distributed File System (HDFS). The company wants a cost-effective solution to make its HBase data highly available.
Which architectural pattern meets the company's requirements?
A. Use Spot Instances for core and task nodes and a Reserved Instance for the EMR master node. Configure the EMR cluster with multiple master nodes. Schedule automated snapshots using Amazon EventBridge.
B. Store the data on an EMR File System (EMRFS) instead of HDFS. Enable EMRFS consistent view. Create an EMR HBase cluster with multiple master nodes. Point the HBase root directory to an Amazon S3 bucket.
C. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view. Run two separate EMR clusters in two different Availability Zones. Point both clusters to the same HBase root directory in the same Amazon S3 bucket.
D. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view. Create a primary EMR HBase cluster with multiple master nodes. Create a secondary EMR HBase read-replica cluster in a separate Availability Zone. Point both clusters to the same HBase root directory in the same Amazon S3 bucket.
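For context on the EMRFS pattern shared by options B, C, and D: HBase on Amazon EMR can store its root directory in S3 instead of HDFS, which decouples the data from any single cluster. A minimal boto3 sketch of launching such a cluster, with hypothetical instance types, counts, and bucket name:

# Hedged sketch: an EMR HBase cluster with its root directory in S3.
# Release label, sizes, and the bucket are hypothetical choices.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.run_job_flow(
    Name="hbase-primary",
    ReleaseLabel="emr-6.9.0",
    Applications=[{"Name": "HBase"}],
    Configurations=[
        {"Classification": "hbase",
         "Properties": {"hbase.emr.storageMode": "s3"}},
        {"Classification": "hbase-site",
         "Properties": {"hbase.rootdir": "s3://example-hbase-root/"}},
    ],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
             "InstanceCount": 3},   # multiple master nodes for HA
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
             "InstanceCount": 4},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)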
A company has several Amazon EC2 instances sitting behind an Application Load Balancer (ALB). The company wants its IT infrastructure team to analyze the IP addresses coming into the company's ALB. The ALB is configured to store access logs in Amazon S3. The access logs create about 1 TB of data each day, and access to the data will be infrequent. The company needs a solution that is scalable, cost-effective, and has minimal maintenance requirements.
Which solution meets these requirements?
A. Copy the data into Amazon Redshift and query the data
B. Use Amazon EMR and Apache Hive to query the S3 data
C. Use Amazon Athena to query the S3 data
D. Use Amazon Redshift Spectrum to query the S3 data
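For context on the Athena pattern in option C: Athena queries the ALB logs in place in S3 with no infrastructure to manage, paying only per query. A minimal boto3 sketch of a top-client-IP query; the database, table, and bucket names are hypothetical, and the table would be defined over the ALB access log format:

# Hedged sketch: an Athena query for the top client IPs in ALB logs.
# "logs_db", "alb_access_logs", and the result bucket are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

athena.start_query_execution(
    QueryString="""
        SELECT client_ip, COUNT(*) AS request_count
        FROM alb_access_logs
        GROUP BY client_ip
        ORDER BY request_count DESC
        LIMIT 100
    """,
    QueryExecutionContext={"Database": "logs_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)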
A large marketing company needs to store all of its streaming logs and create near-real-time dashboards. The dashboards will be used to help the company make critical business decisions and must be highly available.
Which solution meets these requirements?
A. Store the streaming logs in Amazon S3 with replication to an S3 bucket in a different Availability Zone. Create the dashboards by using Amazon QuickSight.
B. Deploy an Amazon Redshift cluster with at least three nodes in a VPC that spans two Availability Zones. Store the streaming logs and use the Redshift cluster as a source to create the dashboards by using Amazon QuickSight.
C. Store the streaming logs in Amazon S3 with replication to an S3 bucket in a different Availability Zone. Every time a new log is added in the bucket, invoke an AWS Lambda function to update the dashboards in Amazon QuickSight.
D. Store the streaming logs in Amazon OpenSearch Service deployed across three Availability Zones and with three dedicated master nodes. Create the dashboards by using OpenSearch Dashboards.
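For context on the highly available OpenSearch layout in option D: a domain can be spread across three Availability Zones with dedicated master nodes for quorum. A minimal boto3 sketch, with hypothetical instance types, counts, and storage settings:

# Hedged sketch: an OpenSearch Service domain across three AZs with
# three dedicated master nodes. All sizing values are hypothetical.
import boto3

opensearch = boto3.client("opensearch", region_name="us-east-1")

opensearch.create_domain(
    DomainName="streaming-logs",
    EngineVersion="OpenSearch_2.5",
    ClusterConfig={
        "InstanceType": "r6g.large.search",
        "InstanceCount": 6,                    # data nodes, two per AZ
        "DedicatedMasterEnabled": True,
        "DedicatedMasterType": "m6g.large.search",
        "DedicatedMasterCount": 3,
        "ZoneAwarenessEnabled": True,
        "ZoneAwarenessConfig": {"AvailabilityZoneCount": 3},
    },
    EBSOptions={"EBSEnabled": True, "VolumeType": "gp3", "VolumeSize": 100},
)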
An online retail company uses Amazon OpenSearch Service to store product information and record website customer events. As customer interactions occur, the company needs to provide images of products to the company's suppliers. The suppliers access the data by using the same application, but each supplier must be able to access only its own subset of data.
Which combination of steps will meet these requirements in the MOST operationally efficient manner? (Choose two.)
A. Embed OpenSearch Service dashboards by using AWS Amplify.
B. Embed Amazon QuickSight dashboards by using the QuickSight SDK.
C. Authorize the dashboard users by using Amazon Cognito.
D. Authorize the dashboard users by using an IAM role that provides permission to retrieve embedding URLs.
E. Implement multi-tenancy for OpenSearch Service and Amazon QuickSight for each supplier.
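For context on the embedding pattern in options B and D: QuickSight can generate a short-lived embed URL for a registered user, and row-level security or namespaces can scope each supplier to its own data subset. A minimal boto3 sketch, with hypothetical account, user, and dashboard identifiers:

# Hedged sketch: generating a QuickSight embed URL for a registered user.
# The account ID, user ARN, and dashboard ID are hypothetical.
import boto3

quicksight = boto3.client("quicksight", region_name="us-east-1")

response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId="123456789012",
    UserArn="arn:aws:quicksight:us-east-1:123456789012:user/default/supplier-a",
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": "11111111-2222-3333-4444-555555555555"}
    },
)
embed_url = response["EmbedUrl"]   # hand this URL to the embedding application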
An online retail company maintains an on-premises MySQL database of customer transactions. The company has selected Amazon Redshift as the data warehouse for its data analytics on AWS. To provide relevant purchase recommendations, the company needs to ensure that new customer transactions are inserted into Amazon Redshift in near-real time.
What is the MOST cost-effective way to replicate this data into Amazon Redshift?
A. Query new transactions from the on-premises database, and upload them to an Amazon S3 bucket. Use AWS Glue and issue COPY statements to write the data into an Amazon Redshift table.
B. Create an Amazon Kinesis data stream, and select an Amazon Redshift table as the target. Change the application code to put new customer records into the Kinesis data stream.
C. Use AWS Database Migration Service (AWS DMS) to migrate a full export of the on-premises database to Amazon S3. Submit a job to an Amazon EMR cluster to query the incremental changes between exports. Load the incremental changes into an Amazon Redshift table.
D. Use AWS Database Migration Service (AWS DMS) to create an ongoing migration task to replicate new transactions from the on-premises database to an Amazon Redshift table.
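For context on the ongoing-replication pattern in option D: a DMS task with the full-load-and-CDC migration type performs an initial copy and then continuously replicates changes from the source. A minimal boto3 sketch; the endpoint and instance ARNs are hypothetical and would be created beforehand:

# Hedged sketch: a DMS full-load-plus-CDC task from MySQL to Redshift.
# All ARNs, schema, and identifier values are hypothetical.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

dms.create_replication_task(
    ReplicationTaskIdentifier="mysql-to-redshift-cdc",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SRC123",
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TGT456",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INST789",
    MigrationType="full-load-and-cdc",      # initial copy plus ongoing changes
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "all-transactions",
            "object-locator": {"schema-name": "sales", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)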
A data analytics specialist needs to encrypt the storage for an Amazon EMR cluster that processes data for a company's monthly financial report. The EMR cluster uses a security configuration template. The local disks of the EMR cluster are not encrypted.
What should the data analytics specialist do to encrypt the local disks?
A. Update the existing security configuration to enable encryption for the local disks. Restart the existing cluster for the configuration to take effect.
B. Create a new security configuration to enable encryption for the local disks. Create a new cluster that uses the new security configuration.
C. Create a new security configuration to enable encryption for the local disks. Restart the existing cluster with the new security configuration.
D. Update the existing security configuration to enable encryption for the local disks. Create a new cluster that uses the updated security configuration.
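For context: EMR applies a security configuration only at cluster launch, and an existing configuration cannot be edited in place, which is why the options differ on creating versus updating and on restarting versus relaunching. A minimal boto3 sketch of a new security configuration that enables local disk encryption; the KMS key ARN is a hypothetical placeholder:

# Hedged sketch: a new EMR security configuration with local disk
# encryption via KMS. The key ARN below is hypothetical.
import json
import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.create_security_configuration(
    Name="local-disk-encryption",
    SecurityConfiguration=json.dumps({
        "EncryptionConfiguration": {
            "EnableInTransitEncryption": False,
            "EnableAtRestEncryption": True,
            "AtRestEncryptionConfiguration": {
                "LocalDiskEncryptionConfiguration": {
                    "EncryptionKeyProviderType": "AwsKms",
                    "AwsKmsKey": "arn:aws:kms:us-east-1:123456789012:key/abcd-ef01",
                }
            },
        }
    }),
)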
A logistics company has an application that generates status data for order shipments. The company uses Amazon Kinesis Data Firehose to ingest data into Amazon S3 in near-real time. The newly ingested data must be merged with historical data before it can be used for business analytics. The company wants a solution that supports data inserts, updates, and deletes with minimal time delays.
Which solution meets these requirements with the LEAST amount of operational effort?
A. Use AWS Glue Spark jobs to populate an Apache ORC table.
B. Use AWS Glue Spark jobs with an Apache Hudi connector to populate an Apache Hudi table.
C. Use Amazon EMR with an Apache Spark script to populate an Apache ORC table.
D. Use Amazon EMR with an Apache Spark script to populate an Apache Hudi table.
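For context on the Hudi pattern in options B and D: Apache Hudi adds record-level upsert and delete semantics on top of files in S3, which plain ORC tables lack. A minimal PySpark sketch of the core of such a job; the table name, key fields, and S3 paths are hypothetical:

# Hedged sketch: upserting newly ingested records into a Hudi table on S3.
# Paths, table name, and key/precombine fields are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("shipment-status-upsert").getOrCreate()

new_batch = spark.read.json("s3://example-ingest/shipments/")   # Firehose output

hudi_options = {
    "hoodie.table.name": "shipment_status",
    "hoodie.datasource.write.recordkey.field": "order_id",
    "hoodie.datasource.write.precombine.field": "event_ts",    # latest record wins
    "hoodie.datasource.write.operation": "upsert",
}

(new_batch.write.format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save("s3://example-lake/shipment_status/"))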