Exam Details

  • Exam Code: APACHE-HADOOP-DEVELOPER
  • Exam Name: Hadoop 2.0 Certification exam for Pig and Hive Developer
  • Certification: HCAHD
  • Vendor: Hortonworks
  • Total Questions: 108 Q&As
  • Last Updated: May 07, 2024

Hortonworks HCAHD APACHE-HADOOP-DEVELOPER Questions & Answers

  • Question 91:

    You have just executed a MapReduce job. Where is intermediate data written to after being emitted from the Mapper's map method?

    A. Intermediate data is streamed across the network from the Mapper to the Reducer and is never written to disk.

    B. Into in-memory buffers on the TaskTracker node running the Mapper that spill over and are written into HDFS.

    C. Into in-memory buffers that spill over to the local file system of the TaskTracker node running the Mapper.

    D. Into in-memory buffers that spill over to the local file system (outside HDFS) of the TaskTracker node running the Reducer.

    E. Into in-memory buffers on the TaskTracker node running the Reducer that spill over and are written into HDFS.
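The map-side spill behaviour this question tests can be sketched as a toy model (a simplified Python illustration, not Hadoop's actual implementation; the class name, buffer capacity, and file naming are all made up for the sketch): map outputs collect in an in-memory buffer, and when the buffer fills they are sorted and written to the local filesystem of the node running the map task, not to HDFS.

```python
import os

class SpillBuffer:
    """Toy model of a map task's in-memory output buffer (illustrative only)."""

    def __init__(self, capacity, spill_dir):
        self.capacity = capacity    # max records held in memory before spilling
        self.buffer = []            # in-memory (key, value) records
        self.spill_files = []       # on-disk spill files on the *local* filesystem
        self.spill_dir = spill_dir  # a local (non-HDFS) directory

    def emit(self, key, value):
        self.buffer.append((key, value))
        if len(self.buffer) >= self.capacity:
            self.spill()

    def spill(self):
        # Sort by key before writing, as the real framework does for each spill.
        self.buffer.sort()
        path = os.path.join(self.spill_dir, f"spill{len(self.spill_files)}.out")
        with open(path, "w") as f:
            for k, v in self.buffer:
                f.write(f"{k}\t{v}\n")
        self.spill_files.append(path)
        self.buffer = []
```

In the real framework the spill files are later merged and served to reducers over the network, but they always live on the mapper node's local disk first.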

  • Question 92:

    Which HDFS command uploads a local file X into an existing HDFS directory Y?

    A. hadoop scp X Y

    B. hadoop fs -localPut X Y

    C. hadoop fs -put X Y

    D. hadoop fs -get X Y
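For reference, the valid upload form is `hadoop fs -put <localsrc> <dst>`; `-get` copies in the other direction (HDFS to local), and `scp` and `-localPut` are not HDFS shell commands. A small hedged Python wrapper (the function name and `dry_run` flag are invented for this sketch; running it for real requires a Hadoop client on the PATH):

```python
import subprocess

def hdfs_put(local_file, hdfs_dir, dry_run=True):
    """Build (and optionally run) the HDFS upload command.

    The correct shell form is `hadoop fs -put <localsrc> <dst>`.
    """
    cmd = ["hadoop", "fs", "-put", local_file, hdfs_dir]
    if not dry_run:
        # Requires a configured Hadoop client; raises on a non-zero exit code.
        subprocess.run(cmd, check=True)
    return cmd
```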

  • Question 93:

    You need to move a file titled "weblogs" into HDFS. When you try to copy the file, you can't. You know you have ample space on your DataNodes. Which action should you take to relieve this situation and store more files in HDFS?

    A. Increase the block size on all current files in HDFS.

    B. Increase the block size on your remaining files.

    C. Decrease the block size on your remaining files.

    D. Increase the amount of memory for the NameNode.

    E. Increase the number of disks (or size) for the NameNode.

    F. Decrease the block size on all current files in HDFS.
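The constraint behind this question is that the NameNode keeps metadata for every file and block in RAM, so the number of objects it can track is bounded by its heap, not by DataNode disk space. A back-of-envelope sketch (the 150-bytes-per-object figure is a commonly cited rule of thumb, used here as an assumption, and the function is invented for illustration):

```python
BYTES_PER_OBJECT = 150  # rough rule of thumb for NameNode metadata per object

def namenode_heap_bytes(num_files, blocks_per_file):
    # One metadata object per file plus one per block, all resident in RAM.
    objects = num_files * (1 + blocks_per_file)
    return objects * BYTES_PER_OBJECT

# Ten million single-block files already need roughly 3 GB of NameNode heap,
# regardless of how much disk the DataNodes have free.
heap_needed = namenode_heap_bytes(10_000_000, 1)
```

This is why adding NameNode memory (not tuning block sizes on existing files) is what lets the cluster hold more files.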

  • Question 94:

    Which one of the following statements describes a Pig bag, tuple, and map, respectively?

    A. Unordered collection of maps, ordered collection of tuples, ordered set of key/value pairs

    B. Unordered collection of tuples, ordered set of fields, set of key/value pairs

    C. Ordered set of fields, ordered collection of tuples, ordered collection of maps

    D. Ordered collection of maps, ordered collection of bags, and unordered set of key/value pairs
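Pig's three complex types map loosely onto familiar Python structures (an analogy only, not Pig's actual implementation): a tuple is an ordered set of fields, a bag is an unordered collection of tuples, and a map is a set of key/value pairs.

```python
# Rough Python analogues of Pig's complex data types (illustrative analogy):
pig_tuple = ("alice", 25, "nyc")            # tuple: ordered fields
pig_bag = {("alice", 25), ("bob", 31)}      # bag: unordered collection of tuples
pig_map = {"name": "alice", "age": 25}      # map: key/value pairs
```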

  • Question 95:

    Which one of the following classes would a Pig command use to store data in a table defined in HCatalog?

    A. org.apache.hcatalog.pig.HCatOutputFormat

    B. org.apache.hcatalog.pig.HCatStorer

    C. No special class is needed for a Pig script to store data in an HCatalog table

    D. Pig scripts cannot use an HCatalog table

  • Question 96:

    Which two of the following statements are true about Pig's approach toward data? Choose 2 answers

    A. Accepts only data that has a key/value pair structure

    B. Accepts data whether it has metadata or not

    C. Accepts only data that is defined by metadata tables stored in a database

    D. Accepts tab-delimited text data only

    E. Accepts any data: structured or unstructured

  • Question 97:

    Given a directory of files with the following structure: line number, tab character, string. For example:

        1	abialkjfjkaoasdfjksdlkjhqweroij
        2	kadfjhuwqounahagtnbvaswslmnbfgy
        3	kjfteiomndscxeqalkzhtopedkfsikj

    You want to send each line as one record to your Mapper. Which InputFormat should you use to complete the line: conf.setInputFormat(____.class); ?

    A. SequenceFileAsTextInputFormat

    B. SequenceFileInputFormat

    C. KeyValueFileInputFormat

    D. BDBInputFormat
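The behaviour the question is after, one record per line with the text before the first tab as the key and the rest as the value, can be sketched in plain Python (a simplified stand-in for the InputFormat's record reader, with an invented function name):

```python
import io

def key_value_records(stream):
    """Yield one (key, value) record per line, split on the first tab."""
    for line in stream:
        line = line.rstrip("\n")
        key, _, value = line.partition("\t")
        yield key, value

sample = io.StringIO("1\tfoo\n2\tbar\n")
records = list(key_value_records(sample))
```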

  • Question 98:

    Examine the following Hive statements:

    Assuming the statements above execute successfully, which one of the following statements is true?

    A. Hive reformats File1 into a structure that Hive can access and moves it to /user/joe/x/

    B. The file named File1 is moved to /user/joe/x/

    C. The contents of File1 are parsed as comma-delimited rows and loaded into /user/joe/x/

    D. The contents of File1 are parsed as comma-delimited rows and stored in a database

  • Question 99:

    You want to perform analysis on a large collection of images. You want to store this data in HDFS and process it with MapReduce, but you also want to give your data analysts and data scientists the ability to process the data directly from HDFS with an interpreted high-level programming language like Python. Which format should you use to store this data in HDFS?

    A. SequenceFiles

    B. Avro

    C. JSON

    D. HTML

    E. XML

    F. CSV
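The idea behind a container format for this workload is to pack many small binary blobs (such as images) into a few large, splittable key/value files, since HDFS and the NameNode handle a handful of big files far better than millions of small ones. The sketch below shows only the concept with a simple length-prefixed layout; it is NOT the real SequenceFile on-disk format, and both function names are invented:

```python
import struct

def pack(records):
    """Pack (name, bytes) records into one length-prefixed byte container."""
    out = bytearray()
    for name, blob in records:
        key = name.encode()
        # 4-byte big-endian key length and value length, then the payloads.
        out += struct.pack(">II", len(key), len(blob))
        out += key + blob
    return bytes(out)

def unpack(data):
    """Recover the (name, bytes) records from a packed container."""
    i, records = 0, []
    while i < len(data):
        klen, vlen = struct.unpack_from(">II", data, i)
        i += 8
        key = data[i:i + klen].decode()
        i += klen
        records.append((key, data[i:i + vlen]))
        i += vlen
    return records
```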

  • Question 100:

    You have a directory named jobdata in HDFS that contains four files: _first.txt, second.txt, .third.txt, and #data.txt. How many files will be processed by the FileInputFormat.setInputPaths() call when it is given a Path object representing this directory?

    A. Four, all files will be processed

    B. Three, the pound sign is an invalid character for HDFS file names

    C. Two, file names with a leading period or underscore are ignored

    D. None, the directory cannot be named jobdata

    E. One, no special characters can prefix the name of an input file
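By default, FileInputFormat skips paths whose names begin with an underscore or a dot (these prefixes mark internal and hidden files). A minimal sketch of that filter (the function name is invented; the real filter is a Java PathFilter):

```python
def default_path_filter(name):
    """Mimic FileInputFormat's default hidden-file filter."""
    return not name.startswith(("_", "."))

files = ["_first.txt", "second.txt", ".third.txt", "#data.txt"]
visible = [f for f in files if default_path_filter(f)]
```

Applied to the question's directory, only second.txt and #data.txt survive the filter.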

Tips on How to Prepare for the Exams

Nowadays, certification exams have become increasingly important and are required by more and more enterprises when applying for a job. But how do you prepare for an exam effectively? How do you prepare in a short time with less effort? How do you get an ideal result, and where do you find the most reliable resources? Here on Vcedump.com, you will find all the answers. Vcedump.com provides not only Hortonworks exam questions, answers, and explanations but also complete assistance with your exam preparation and certification application. If you are unsure about your APACHE-HADOOP-DEVELOPER exam preparation or Hortonworks certification application, do not hesitate to visit Vcedump.com to find your solutions.