Point out the Correct Statement.
a) The Hadoop framework publishes the job flow status to an internally running web server on the master nodes of the Hadoop cluster
b) Each incoming file is broken into 32 MB blocks by default
c) Data blocks are replicated across different nodes in the cluster to ensure a low degree of fault tolerance
d) None of the options is correct

Answer: a) The Hadoop framework publishes the job flow status to an internally running web server on the master nodes of the Hadoop cluster.
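Why the other options fail: HDFS splits incoming files into 128 MB blocks by default (64 MB in Hadoop 1.x), not 32 MB, and each block is replicated three times by default to give a high degree of fault tolerance, not a low one. The sketch below is a minimal illustration, assuming a Hadoop client classpath and a reachable cluster configured via core-site.xml/hdfs-site.xml; it reads the effective defaults through the FileSystem API.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDefaults {
    public static void main(String[] args) throws Exception {
        // Load the cluster configuration from the files on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path root = new Path("/");

        // Default block size: 128 MB on Hadoop 2.x/3.x (64 MB on 1.x), not 32 MB.
        System.out.println("Default block size (bytes): " + fs.getDefaultBlockSize(root));

        // Default replication factor: 3, which is what gives HDFS its fault tolerance.
        System.out.println("Default replication factor: " + fs.getDefaultReplication(root));
    }
}
```

As for the correct option, the "internally running web server" refers to the HTTP UIs served by the master daemons (the NameNode and the ResourceManager, or the JobTracker in Hadoop 1.x), which expose job flow and cluster status.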