AllExam Dumps



READ Free Dumps For Cloudera CCD-410





Question ID 12495

When can a reduce class also serve as a combiner without affecting the output of a
MapReduce program?

Option A

When the types of the reduce operation's input key and input value match the types of the reducer's output key and output value, and when the reduce operation is both commutative and associative.

Option B

When the signature of the reduce method matches the signature of the combine method.

Option C

Always. Code can be reused in Java since it is a polymorphic object-oriented programming language.

Option D

Always. The point of a combiner is to serve as a mini-reducer directly after the map phase to increase performance.

Option E

Never. Combiners and reducers must be implemented separately because they serve different purposes.

Correct Answer A
Explanation: You can use your reducer code as a combiner if the operation performed is commutative and associative. Reference: 24 Interview Questions & Answers for Hadoop MapReduce developers, What are combiners? When should I use a combiner in my MapReduce Job?
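
As an illustrative sketch (not part of the original question), the conditions in Option A hold for a summing reducer: its input and output types match (Text/IntWritable in, Text/IntWritable out) and addition is commutative and associative, so the same class can safely be registered as the combiner. The class name SumReducer and the word-count-style types are assumptions for the example.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Summing is commutative and associative, and the reduce input key/value types
// match the output key/value types, so this class can serve as both the
// reducer and the combiner without changing the job's final output.
public class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
  private final IntWritable result = new IntWritable();

  @Override
  protected void reduce(Text key, Iterable<IntWritable> values, Context context)
      throws IOException, InterruptedException {
    int sum = 0;
    for (IntWritable value : values) {
      sum += value.get();
    }
    result.set(sum);
    context.write(key, result);
  }
}

In the driver, the reuse is then a single extra line: job.setReducerClass(SumReducer.class); job.setCombinerClass(SumReducer.class);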


Question ID 12496

You need to run the same job many times with minor variations. Rather than hardcoding all
job configuration options in your driver code, you've decided to have your Driver subclass
org.apache.hadoop.conf.Configured and implement the org.apache.hadoop.util.Tool
interface.
Identify which invocation correctly passes mapred.job.name with a value of Example to
Hadoop.

Option A

hadoop "mapred.job.name=Example" MyDriver input output

Option B

hadoop MyDriver mapred.job.name=Example input output

Option C

hadoop MyDriver -D mapred.job.name=Example input output

Option D

hadoop setproperty mapred.job.name=Example MyDriver input output

Option E

hadoop setproperty ("mapred.job.name=Example") MyDriver input output

Correct Answer C
Explanation: Configure the property using the -D key=value notation, for example: -D mapred.job.name='My Job'. You can list a whole set of options by calling the streaming jar with just the -info argument. Reference: Python Hadoop streaming: Setting a job name
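
As a minimal sketch of the driver the question describes (the class name MyDriver comes from the options; the mapper/reducer setup and input/output handling are assumptions for illustration), subclassing Configured and implementing Tool lets ToolRunner's GenericOptionsParser consume -D key=value arguments before run() sees the remaining positional arguments:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// ToolRunner runs GenericOptionsParser first, so any -D key=value pairs on the
// command line (e.g. -D mapred.job.name=Example) are already present in the
// Configuration returned by getConf() when run() is called.
public class MyDriver extends Configured implements Tool {

  @Override
  public int run(String[] args) throws Exception {
    // The job picks up mapred.job.name (and any other -D settings) from getConf().
    Job job = Job.getInstance(getConf());
    job.setJarByClass(MyDriver.class);
    // Mapper/Reducer classes and output key/value types would be set here.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
  }
}

It would then be launched as in Option C, hadoop MyDriver -D mapred.job.name=Example input output (or via hadoop jar with the job jar on the command line), with the -D pair placed before the positional input and output arguments so the generic options parser can strip it out.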
