This section includes 95 MCQs, each a curated multiple-choice question to sharpen your Apache Hadoop knowledge and support exam preparation. Work through the questions below to get started.
| 1. |
________ nodes control the start and end of the workflow and the workflow job execution path. |
| A. | Action |
| B. | Control |
| C. | Data |
| D. | SubDomain |
| Answer» B. Control | |
| 2. |
Nodes in the config _____________ must be completed successfully. |
| A. | oozie.wid.rerun.skip.nodes |
| B. | oozie.wf.rerun.skip.nodes |
| C. | oozie.wf.run.skip.nodes |
| D. | all of the mentioned |
| Answer» B. oozie.wf.rerun.skip.nodes (see the sketch below) | |
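For context, here is a minimal, hedged sketch of how that rerun-skip property is typically supplied through the Oozie Java client. The server URL, application path, workflow job id, and node names are placeholder assumptions, not values taken from the questions above.

```java
import java.util.Properties;
import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.OozieClientException;

public class RerunWithSkipNodes {
    public static void main(String[] args) throws OozieClientException {
        // Placeholder Oozie server URL; adjust for your cluster.
        OozieClient client = new OozieClient("http://localhost:11000/oozie");

        // Build the rerun configuration for an already-finished workflow job.
        Properties conf = client.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH, "hdfs://namenode:8020/user/alice/workflow-app");
        // Nodes listed here must have completed successfully in the previous run;
        // the action handler skips them with the same exit transition as before.
        conf.setProperty("oozie.wf.rerun.skip.nodes", "prepare-node,cleanup-node");

        // Placeholder workflow job id.
        client.reRun("0000001-200101000000000-oozie-W", conf);
    }
}
```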
| 3. |
_____________ will skip the nodes given in the config with the same exit transition as before. |
| A. | ActionMega handler |
| B. | Action handler |
| C. | Data handler |
| D. | None of the mentioned |
| Answer» B. Action handler | |
| 4. |
Which of the following workflow definition languages is XML-based? |
| A. | hPDL |
| B. | hDL |
| C. | hiDL |
| D. | none of the mentioned |
| Answer» A. hPDL | |
| 5. |
___________ nodes are the mechanism by which a workflow triggers the execution of a computation/processing task. |
| A. | Server |
| B. | Client |
| C. | Mechanism |
| D. | Action |
| Answer» D. Action | |
| 6. |
Node names and transitions must conform to the pattern [a-zA-Z][\-_a-zA-Z0-9]*, of up to __________ characters long. |
| A. | 10 |
| B. | 15 |
| C. | 20 |
| D. | 25 |
| Answer» C. 20 (see the regex sketch below) | |
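As an illustration of that naming rule, here is a small, self-contained Java sketch that checks a candidate node name against the pattern and the 20-character limit; the class and method names are made up for this example.

```java
import java.util.regex.Pattern;

public class NodeNameCheck {
    // Pattern from the Oozie workflow spec: [a-zA-Z][\-_a-zA-Z0-9]*
    private static final Pattern NODE_NAME = Pattern.compile("[a-zA-Z][\\-_a-zA-Z0-9]*");

    static boolean isValidNodeName(String name) {
        // Valid names match the pattern and are at most 20 characters long.
        return name.length() <= 20 && NODE_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidNodeName("my-mr-node"));                 // true
        System.out.println(isValidNodeName("9starts-with-a-digit"));       // false
        System.out.println(isValidNodeName("this-name-is-far-too-long"));  // false (25 chars)
    }
}
```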
| 7. |
Workflow with id __________ should be in SUCCEEDED/KILLED/FAILED. |
| A. | wfId |
| B. | iUD |
| C. | iFD |
| D. | all of the mentioned |
| Answer» A. wfId (see the status-check sketch below) | |
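A short sketch of checking that precondition with the Oozie Java client follows; the server URL and workflow job id are placeholders.

```java
import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.OozieClientException;
import org.apache.oozie.client.WorkflowJob;

public class CheckRerunPrecondition {
    public static void main(String[] args) throws OozieClientException {
        // Placeholder Oozie server URL and workflow job id.
        OozieClient client = new OozieClient("http://localhost:11000/oozie");
        WorkflowJob job = client.getJobInfo("0000001-200101000000000-oozie-W");

        // A workflow can only be rerun once it is in a terminal state.
        WorkflowJob.Status status = job.getStatus();
        boolean terminal = status == WorkflowJob.Status.SUCCEEDED
                || status == WorkflowJob.Status.KILLED
                || status == WorkflowJob.Status.FAILED;
        System.out.println(job.getId() + " rerunnable: " + terminal);
    }
}
```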
| 8. |
A workflow definition is a ______ with control flow nodes or action nodes. |
| A. | CAG |
| B. | DAG |
| C. | BAG |
| D. | None of the mentioned |
| Answer» B. DAG | |
| 9. |
A workflow definition must have one ________ node. |
| A. | start |
| B. | resume |
| C. | finish |
| D. | none of the mentioned |
| Answer» A. start | |
| 10. |
A _______________ action can be configured to perform file system cleanup and directory creation before starting the mapreduce job. |
| A. | map |
| B. | reduce |
| C. | map-reduce |
| D. | none of the mentioned |
| Answer» C. map-reduce | |
| 11. |
If the failure is of ___________ nature, Oozie will suspend the workflow job. |
| A. | transient |
| B. | non-transient |
| C. | permanent |
| D. | all of the mentioned |
| Answer» B. non-transient | |
| 12. |
If a computation/processing task triggered by a workflow fails to complete successfully, it transitions to: |
| A. | error |
| B. | ok |
| C. | true |
| D. | false |
| Answer» A. error | |
| 13. |
All decision nodes must have a _____________ element to avoid bringing the workflow into an error state if none of the predicates evaluates to true. |
| A. | name |
| B. | default |
| C. | server |
| D. | client |
| Answer» B. default | |
| 14. |
A ___________ node enables a workflow to make a selection on the execution path to follow. |
| A. | fork |
| B. | decision |
| C. | start |
| D. | none of the mentioned |
| Answer» B. decision | |
| 15. |
If one or more actions started by the workflow job are executing when the ________ node is reached, the actions will be killed. |
| A. | kill |
| B. | start |
| C. | end |
| D. | finish |
| Answer» A. kill | |
| 16. |
The ___________ attribute in the join node is the name of the workflow join node. |
| A. | name |
| B. | to |
| C. | down |
| D. | none of the mentioned |
| Answer» A. name | |
| 17. |
Which of the following can be seen as a switch-case statement? |
| A. | fork |
| B. | decision |
| C. | start |
| D. | none of the mentioned |
| Answer» B. decision | |
| 18. |
___________ properties can be overridden by specifying them in the job-xml file or configuration element. |
| A. | Pipe |
| B. | Decision |
| C. | Flag |
| D. | None of the mentioned |
| Answer» A. Pipe | |
| 19. |
Falcon promotes decoupling of data set location from ___________ definition. |
| A. | Oozie |
| B. | Impala |
| C. | Kafka |
| D. | Thrift |
| Answer» A. Oozie | |
| 20. |
Falcon provides seamless integration with: |
| A. | HCatalog |
| B. | metastore |
| C. | HBase |
| D. | Kafka |
| Answer» B. metastore | |
| 21. |
Apache Knox provides __________ REST API Access Point. |
| A. | Single |
| B. | Double |
| C. | Multiple |
| D. | Zero |
| Answer» A. Single | |
| 22. |
Apache Knox accesses the Hadoop cluster over: |
| A. | HTTP |
| B. | TCP |
| C. | ICMP |
| D. | None of the mentioned |
| Answer» A. HTTP (see the sketch below) | |
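To make the point concrete, here is a hedged sketch of calling WebHDFS through a Knox gateway over HTTP(S) with Java 11's built-in HTTP client. The gateway host, the 8443 port, the topology name sandbox, and the guest credentials are assumptions taken from typical Knox demo setups; a trusted TLS certificate is also assumed.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class KnoxWebHdfsList {
    public static void main(String[] args) throws Exception {
        // Assumed demo credentials; a real deployment would use its own auth provider.
        String auth = Base64.getEncoder().encodeToString("guest:guest-password".getBytes());

        // WebHDFS LISTSTATUS call routed through the Knox gateway (assumed topology "sandbox").
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://knox-host:8443/gateway/sandbox/webhdfs/v1/tmp?op=LISTSTATUS"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```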
| 23. |
HDT provides wizards for creating Java classes for: |
| A. | Mapper |
| B. | Reducer |
| C. | Driver |
| D. | All of the mentioned |
| Answer» D. All of the mentioned | |
| 24. |
HDT is used for listing running jobs on a __________ cluster. |
| A. | MR |
| B. | Hive |
| C. | Pig |
| D. | None of the mentioned |
| Answer» A. MR | |
| 25. |
__________ is a columnar storage format for Hadoop. |
| A. | Ranger |
| B. | Parquet |
| C. | REEF |
| D. | None of the mentioned |
| Answer» B. Parquet | |
| 26. |
Ripple is a browser based mobile phone emulator designed to aid in the development of _______ based mobile applications. |
| A. | Javascript |
| B. | Java |
| C. | C++ |
| D. | HTML5 |
| Answer» D. HTML5 | |
| 27. |
__________ is an abstraction over Apache Hadoop YARN that reduces the complexity of developing distributed applications |
| A. | Wave |
| B. | Twill |
| C. | Usergrid |
| D. | None of the mentioned |
| Answer» B. Twill | |
| 28. |
___________ is a Java library for writing, testing, and running pipelines of MapReduce jobs on Apache Hadoop. |
| A. | cTakes |
| B. | Crunch |
| C. | CouchDB |
| D. | None of the mentioned |
| Answer» B. Crunch (see the pipeline sketch below) | |
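As a brief illustration, here is a minimal Crunch word-count pipeline in the spirit of the project's getting-started example; the input and output HDFS paths are placeholders.

```java
import org.apache.crunch.DoFn;
import org.apache.crunch.Emitter;
import org.apache.crunch.PCollection;
import org.apache.crunch.PTable;
import org.apache.crunch.Pipeline;
import org.apache.crunch.impl.mr.MRPipeline;
import org.apache.crunch.types.writable.Writables;

public class CrunchWordCount {
    public static void main(String[] args) {
        // Placeholder HDFS paths.
        Pipeline pipeline = new MRPipeline(CrunchWordCount.class);
        PCollection<String> lines = pipeline.readTextFile("/user/alice/input");

        // Split each line into words.
        PCollection<String> words = lines.parallelDo(new DoFn<String, String>() {
            @Override
            public void process(String line, Emitter<String> emitter) {
                for (String word : line.split("\\s+")) {
                    emitter.emit(word);
                }
            }
        }, Writables.strings());

        // Count occurrences and run the underlying MapReduce job(s).
        PTable<String, Long> counts = words.count();
        pipeline.writeTextFile(counts, "/user/alice/word-counts");
        pipeline.done();
    }
}
```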
| 29. |
A _________ is a hosted, live, concurrent data structure for rich communication. |
| A. | Wave |
| B. | Twill |
| C. | Usergrid |
| D. | None of the mentioned |
| Answer» A. Wave | |
| 30. |
Which of the following projects will create a SOA services framework? |
| A. | DeltaCloud |
| B. | CXF |
| C. | DeltaSpike |
| D. | None of the mentioned |
| Answer» B. CXF | |
| 31. |
_________ class allows other programs to get incoming chunks fed to them over a socket by the collector. |
| A. | PipelineStageWriter |
| B. | PipelineWriter |
| C. | SocketTeeWriter |
| D. | None of the mentioned |
| Answer» C. SocketTeeWriter | |
| 32. |
__________ runs Demux parsers to convert unstructured data into semi-structured data, then loads the key-value pairs into an HBase table. |
| A. | HCatWriter |
| B. | HBWriter |
| C. | HBaseWriter |
| D. | None of the mentioned |
| Answer» C. HBaseWriter | |
| 33. |
If Ambari Agent has any output in /var/log/ambari-agent/ambari-agent.out, it is indicative of a __________ problem. |
| A. | Less Severe |
| B. | Significant |
| C. | Extremely Severe |
| D. | None of the mentioned |
| Answer» B. Significant | |
| 34. |
Ambari provides a ________ API that enables integration with existing tools, such as Microsoft System Center |
| A. | RestLess |
| B. | Web Service |
| C. | RESTful |
| D. | None of the mentioned |
| Answer» C. RESTful (see the sketch below) | |
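A small hedged sketch of calling that REST API from Java follows; the Ambari host, the default 8080 port, and the admin/admin credentials are assumptions for illustration only.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class AmbariListClusters {
    public static void main(String[] args) throws Exception {
        // Assumed default credentials; change them on any real server.
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());

        // GET /api/v1/clusters returns the clusters managed by this Ambari server.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://ambari-host:8080/api/v1/clusters"))
                .header("Authorization", "Basic " + auth)
                .header("X-Requested-By", "ambari")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```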
| 35. |
Collectors write chunks to logs/*.chukwa files until a ___ MB chunk is reached. |
| A. | 64 |
| B. | 108 |
| C. | 256 |
| D. | 1024 |
| Answer» A. 64 | |
| 36. |
Chukwa is an ___________ data collection system for managing large distributed systems. |
| A. | open source |
| B. | proprietary |
| C. | service based |
| D. | none of the mentioned |
| Answer» A. open source | |
| 37. |
The only metadata retained on a per-consumer basis is the position of the consumer in the log, called: |
| A. | offset |
| B. | partition |
| C. | chunks |
| D. | all of the mentioned |
| Answer» A. offset (see the consumer sketch below) | |
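To show where the offset surfaces in client code, here is a minimal Kafka consumer sketch; the broker address, group id, and topic name are placeholders.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OffsetPeek {
    public static void main(String[] args) {
        // Placeholder connection settings.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "demo-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                // The offset is the consumer's position for this partition of the log.
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
        }
    }
}
```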
| 38. |
_________ has stronger ordering guarantees than a traditional messaging system. |
| A. | Kafka |
| B. | Slider |
| C. | Suz |
| D. | None of the mentioned |
| Answer» A. Kafka | |
| 39. |
Which of the following is a spatial information system? |
| A. | Sling |
| B. | Solr |
| C. | SIS |
| D. | All of the mentioned |
| Answer» C. SIS | |
| 40. |
Stratos will be a polyglot _________ framework. |
| A. | DaaS |
| B. | PaaS |
| C. | SaaS |
| D. | RaaS |
| Answer» B. PaaS | |
| 41. |
Which of the following supports random-writable and advance-able sparse bitsets? |
| A. | Stratos |
| B. | Kafka |
| C. | Sqoop |
| D. | Lucene |
| Answer» D. Lucene | |
| 42. |
The ___________ project will create an ESB and component suite based on the Java Business Interface (JBI) standard – JSR 208. |
| A. | ServiceMix |
| B. | Samza |
| C. | Rave |
| D. | All of the mentioned |
| Answer» A. ServiceMix | |
| 43. |
___________ is a distributed data warehouse system for Hadoop. |
| A. | Stratos |
| B. | Tajo |
| C. | Sqoop |
| D. | Lucene |
| Answer» B. Tajo | |
| 44. |
____________ is an open-source version control system. |
| A. | Stratos |
| B. | Kafka |
| C. | Sqoop |
| D. | Subversion |
| Answer» D. Subversion | |
| 45. |
___________ is a distributed, fault-tolerant, and high-performance realtime computation system |
| A. | Knife |
| B. | Storm |
| C. | Sqoop |
| D. | Lucene |
| Answer» B. Storm | |
| 46. |
__________ is a cluster manager that provides resource sharing and isolation across cluster applications. |
| A. | Merlin |
| B. | Mesos |
| C. | Max |
| D. | Merge |
| Answer» B. Mesos | |
| 47. |
Which of the following is a data access framework? |
| A. | Merge |
| B. | Lucene.NET |
| C. | MetaModel |
| D. | None of the mentioned |
| Answer» C. MetaModel | |
| 48. |
__________ is a library to support unit testing of Hadoop MapReduce jobs. |
| A. | Myfaces |
| B. | Muse |
| C. | modftp |
| D. | None of the mentioned |
| Answer» D. None of the mentioned (see the MRUnit sketch below) | |
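The library usually meant here is Apache MRUnit, which is not among the listed options. Below is a hedged sketch of a typical MRUnit map-side test; WordCountMapper is a hypothetical Mapper<LongWritable, Text, Text, IntWritable> that emits (word, 1) pairs, assumed only for this illustration.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Test;

public class WordCountMapperTest {
    @Test
    public void emitsOnePerToken() throws IOException {
        // WordCountMapper is hypothetical; plug in the mapper you want to test.
        MapDriver<LongWritable, Text, Text, IntWritable> driver =
                MapDriver.newMapDriver(new WordCountMapper());

        driver.withInput(new LongWritable(0), new Text("hadoop hadoop"))
              .withOutput(new Text("hadoop"), new IntWritable(1))
              .withOutput(new Text("hadoop"), new IntWritable(1))
              .runTest();
    }
}
```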
| 49. |
Which of the following is a robust implementation of the OASIS WSDM? |
| A. | Myfaces |
| B. | Muse |
| C. | modftp |
| D. | None of the mentioned |
| Answer» B. Muse | |
| 50. |
__________ is used for logging in the .NET framework. |
| A. | log4net |
| B. | logphp |
| C. | Lucene.NET |
| D. | All of the mentioned |
| Answer» A. log4net | |