Explore topic-wise MCQs on Hadoop and the Big Data ecosystem.

This section includes 657 curated multiple-choice questions to sharpen your knowledge of Hadoop and related tools and to support exam preparation. Choose a topic below to get started.

1.

__________ is a framework for building Java Server application GUIs

A. Myfaces
B. Muse
C. Flume
D. BigTop
Answer» A. Myfaces
2.

Which of the following is a Content Management and publishing system based on Cocoon?

A. LibCloud
B. Kafka
C. Lenya
D. All of the mentioned
Answer» C. Lenya
3.

__________ is an online NoSQL database developed by Cloudera.

A. HCatalog
B. Hbase
C. Impala
D. Oozie
Answer» C. Impala
4.

To configure short-circuit local reads, you will need to enable ____________ on local Hadoop.

A. librayhadoop
B. libhadoop
C. libhad
D. none of the mentioned
Answer» B. libhadoop
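Beyond loading the native libhadoop library, short-circuit local reads are switched on through properties in hdfs-site.xml. A minimal sketch of generating that configuration fragment with Python's standard library; the property names are the documented HDFS settings, while the socket path value is only an example:

```python
import xml.etree.ElementTree as ET

def hdfs_site_fragment(props):
    """Build a Hadoop <configuration> XML fragment from a dict of properties."""
    root = ET.Element("configuration")
    for name, value in props.items():
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return ET.tostring(root, encoding="unicode")

# Documented HDFS properties for short-circuit local reads; the domain
# socket path below is an example location, not a required one.
fragment = hdfs_site_fragment({
    "dfs.client.read.shortcircuit": "true",
    "dfs.domain.socket.path": "/var/lib/hadoop-hdfs/dn_socket",
})
print(fragment)
```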
5.

_______ is an open source set of libraries, tools, examples, and documentation.

A. Kite
B. Kize
C. Ookie
D. All of the mentioned
Answer» A. Kite
6.

CDH lets you process and control sensitive data and facilitates:

A. multi-tenancy
B. flexibility
C. scalability
D. all of the mentioned
Answer» A. multi-tenancy
7.

Apache _________ is a project that enables development and consumption of REST style web services.

A. Wives
B. Wink
C. Wig
D. All of the mentioned
Answer» B. Wink
8.

__________ is a non-blocking, asynchronous, event driven high performance web framework.

A. AWS
B. AWF
C. AWT
D. ASW
Answer» B. AWF
9.

Cloudera ___________ includes CDH and an annual subscription license (per node) to Cloudera Manager and technical support.

A. Enterprise
B. Express
C. Standard
D. All of the mentioned
Answer» A. Enterprise
10.

Which of the following is a standard compliant XML Query processor?

A. Whirr
B. VXQuery
C. Knife
D. Lens
Answer» B. VXQuery
11.

__________ is a log collection and correlation software with reporting and alarming functionalities.

A. Lucene
B. ALOIS
C. Imphal
D. None of the mentioned
Answer» B. ALOIS
12.

Cloudera Enterprise comes in ___________ editions.

A. One
B. Two
C. Three
D. Four
Answer» C. Three
13.

Amazon EMR also allows you to run multiple versions concurrently, allowing you to control your ___________ version upgrade.

A. Pig
B. Windows Server
C. Hive
D. Ubuntu
Answer» C. Hive
14.

Microsoft .NET Library for Avro provides data serialization for the Microsoft ___________ environment.

A. .NET
B. Hadoop
C. Ubuntu
D. None of the mentioned
Answer» A. .NET
15.

The Amazon EMR default input format for Hive is:

A. org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
B. org.apache.hadoop.hive.ql.iont.CombineHiveInputFormat
C. org.apache.hadoop.hive.ql.io.CombineFormat
D. All of the mentioned
Answer» A. org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
16.

The key _________ command - which is traditionally a bash script - is also re-implemented as hadoop.cmd.

A. start
B. hadoop
C. had
D. hadstrat
Answer» B. hadoop
17.

Which of the following benefits is not a feature of HDInsight?

A. High availability
B. High reliability
C. High cost
D. All of the mentioned
Answer» C. High cost
18.

Amazon EMR clusters can read and process Amazon _________ streams directly.

A. Kinet
B. kinematics
C. Kinesis
D. None of the mentioned
Answer» C. Kinesis
19.

Impala on Amazon EMR requires _________ running Hadoop 2.x or greater.

A. AMS
B. AMI
C. AWR
D. All of the mentioned
Answer» B. AMI
20.

Amazon EMR uses Hadoop processing combined with several __________ products.

A. AWS
B. ASQ
C. AMR
D. AWES
Answer» A. AWS
21.

Hadoop clusters running on Amazon EMR use ______ instances as virtual Linux servers for the master and slave nodes.

A. EC2
B. EC3
C. EC4
D. None of the mentioned
Answer» A. EC2
22.

___________ is an RPC framework that defines a compact binary serialization format used to persist data structures for later analysis.

A. Pig
B. Hive
C. Thrift
D. None of the mentioned
Answer» C. Thrift
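To make "compact binary serialization" concrete, here is a toy Python sketch of packing a record into fixed-width binary fields and reading it back. This is only an illustration of the idea; it is not Thrift's actual wire format, and the record layout is invented for the example:

```python
import struct

# Toy compact binary encoding (NOT Thrift's real protocol):
# a 32-bit id, a 64-bit float, then a length-prefixed byte string.
def encode(user_id: int, score: float, name: bytes) -> bytes:
    return struct.pack(">Id", user_id, score) + struct.pack(">H", len(name)) + name

def decode(buf: bytes):
    user_id, score = struct.unpack_from(">Id", buf, 0)   # bytes 0..11
    (n,) = struct.unpack_from(">H", buf, 12)             # 2-byte length
    name = buf[14:14 + n]
    return user_id, score, name

blob = encode(42, 3.5, b"ada")
assert decode(blob) == (42, 3.5, b"ada")
```

The payoff of such formats is exactly what the question hints at: structures persisted this way are small and can be deserialized later for analysis.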
23.

Which of the following Hadoop file formats is supported by Impala?

A. SequenceFile
B. Avro
C. RCFile
D. All of the mentioned
Answer» D. All of the mentioned
24.

__________ node distributes code across the cluster.

A. Zookeeper
B. Nimbus
C. Supervisor
D. None of the mentioned
Answer» B. Nimbus
25.

____________ communicates with Nimbus through Zookeeper, starts and stops workers according to signals from Nimbus.

A. Zookeeper
B. Nimbus
C. Supervisor
D. None of the mentioned
Answer» C. Supervisor
26.

___________ builds virtual machines of branches trunk and 0.3 for KVM, VMWare and VirtualBox.

A. Bigtop-trunk-packagetest
B. Bigtop-trunk-repository
C. Bigtop-VM-matrix
D. None of the mentioned
Answer» C. Bigtop-VM-matrix
27.

Apache __________ is a generic cluster management framework used to build distributed systems.

A. Helix
B. Gereition
C. FtpServer
D. None of the mentioned
Answer» A. Helix
28.

The __________ Data Mapper framework makes it easier to use a database with Java or .NET applications.

A. iBix
B. Helix
C. iBATIS
D. iBAT
Answer» C. iBATIS
29.

Kafka only provides a _________ order over messages within a partition.

A. partial
B. total
C. 0.3
D. none of the mentioned
Answer» B. total
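Kafka preserves append order within a single partition but gives no ordering guarantee across partitions. A minimal Python sketch of why: records are routed to a partition (here by hashing the key, as the default producer partitioner does for keyed records), so all records with the same key land in one partition in send order. The partition count and routing function are illustrative:

```python
# Sketch of per-partition ordering: each partition is an append-only list,
# and a record's partition is chosen by hashing its key.
NUM_PARTITIONS = 3
partitions = {p: [] for p in range(NUM_PARTITIONS)}

def produce(key: str, value: str) -> None:
    p = hash(key) % NUM_PARTITIONS  # same key -> same partition
    partitions[p].append(value)

for i in range(5):
    produce("user-1", f"event-{i}")

# All "user-1" records sit in one partition, in the order they were sent.
p = hash("user-1") % NUM_PARTITIONS
assert partitions[p] == [f"event-{i}" for i in range(5)]
```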
30.

Which of the following is a Java-based tool for tracking, resolving, and managing project dependencies?

A. jclouds
B. JDO
C. Ivy
D. All of the mentioned
Answer» C. Ivy
31.

Each Kafka partition has one server which acts as the _________.

A. leaders
B. followers
C. staters
D. all of the mentioned
Answer» A. leaders
32.

__________ is the amount of time to keep a log segment before it is deleted.

A. log.cleaner.enable
B. log.retention
C. log.index.enable
D. log.flush.interval.message
Answer» B. log.retention
33.

Kafka uses key-value pairs in the ____________ file format for configuration.

A. RFC
B. Avro
C. Property
D. None of the mentioned
Answer» C. Property
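Kafka's broker configuration files (such as server.properties) are plain key=value property lines, with blank lines and `#` comments ignored. A minimal parser sketch in Python; the property names below are real Kafka broker settings, but the values are examples only:

```python
# Example Kafka-style property file: key=value lines plus comments.
SAMPLE = """\
# comment lines and blanks are ignored
broker.id=0
log.retention.hours=168
zookeeper.connect=localhost:2181
"""

def parse_properties(text: str) -> dict:
    """Parse key=value property lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

props = parse_properties(SAMPLE)
assert props["log.retention.hours"] == "168"
```

Note how this ties back to the previous question: `log.retention.hours` is one of the retention settings controlling how long a log segment is kept before deletion.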
34.

Communication between the clients and the servers is done with a simple, high-performance, language agnostic _________ protocol.

A. IP
B. TCP
C. SMTP
D. ICMP
Answer» B. TCP
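Kafka's client-server protocol runs over TCP and exchanges size-prefixed binary messages. A generic Python sketch of that framing idea (a 4-byte big-endian length followed by the payload); this illustrates the framing style only, not Kafka's actual request schema:

```python
import struct

# Length-prefixed framing: every message starts with a 4-byte size field,
# so a reader on a TCP stream knows where each message ends.
def frame(payload: bytes) -> bytes:
    return struct.pack(">I", len(payload)) + payload

def unframe(buf: bytes) -> bytes:
    (size,) = struct.unpack_from(">I", buf, 0)
    return buf[4:4 + size]

msg = frame(b"metadata-request")
assert unframe(msg) == b"metadata-request"
```

Because the framing carries no language-specific structure, any client that can open a TCP socket and pack bytes can speak such a protocol, which is what "language agnostic" means here.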
35.

__________ provides the functionality of a messaging system.

A. Oozie
B. Kafka
C. Lucene
D. BigTop
Answer» B. Kafka
36.

InfoSphere DataStage has __________ levels of Parallelism.

A. 1
B. 2
C. 3
D. 4
Answer» B. 2
37.

DataStage RTI is real time integration pack for :

A. STD
B. ISD
C. EXD
D. None of the mentioned
Answer» B. ISD
38.

__________ is a name given to the version of DataStage that had a parallel processing architecture and parallel ETL jobs.

A. Enterprise Edition
B. Server Edition
C. MVS Edition
D. TX
Answer» A. Enterprise Edition
39.

The IBM _____________ Platform provides all the foundational building blocks of trusted information, including data integration, data warehousing, master data management, big data and information governance.

A. InfoStream
B. InfoSphere
C. InfoSurface
D. InfoData
Answer» B. InfoSphere
40.

___________ is used for processing complex transactions and messages.

A. PX
B. Server Edition
C. MVS Edition
D. TX
Answer» D. TX
41.

EC2 capacity can be increased or decreased in real time from as few as one to more than ___________ virtual machines simultaneously.

A. 1000
B. 2000
C. 3000
D. None of the mentioned
Answer» A. 1000
42.

InfoSphere DataStage uses a client/server design where jobs are created and administered via a ________ client against a central repository on a server.

A. Ubuntu
B. Windows
C. Debian
D. Solaris
Answer» B. Windows
43.

InfoSphere ___________ provides you with the ability to flexibly meet your unique information integration requirements.

A. Data Server
B. Information Server
C. Info Server
D. All of the mentioned
Answer» B. Information Server
44.

AMI is uploaded to the Amazon _______ and registered with Amazon EC2, creating a so-called AMI identifier (AMI ID).

A. S2
B. S3
C. S4
D. S5
Answer» B. S3
45.

DataStage originated at __________, a company that developed two notable products: the UniVerse database and the DataStage ETL tool.

A. VMark
B. Vzen
C. Hatez
D. None of the mentioned
Answer» A. VMark