11 Hadoop Practise Session

This document provides instructions and details for practicing with Hadoop and HDFS. It outlines steps to start a Hadoop cluster, format it, check daemon status, create folders, copy files to and from HDFS, change the block size through configuration and at runtime, and visualize the changes through the web interface. It also lists the default ports, protocols, and configuration parameters for common Hadoop daemons such as the namenode, datanode, jobtracker, and tasktracker, along with the web UI ports and parameters for reaching their interfaces.
Hadoop – HDFS Practise Session

Practise
Start Hadoop Cluster

Format the cluster

Check that the daemons are all running

Create folders in HDFS

Copy/put some files

Get some files (see the HDFS API sketch after this list)

Change block size

  Through configuration
  At runtime, via command
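In this Hadoop 1.x setting, starting and formatting the cluster are shell-side steps (bin/start-all.sh, hadoop namenode -format, and jps to check which daemons are up), and the folder/copy/get steps map to hadoop fs -mkdir, -put and -get. The sketch below does the same folder/copy/get steps through the HDFS Java API; the NameNode URI and all paths are placeholders for this exercise.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPractise {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode URI; normally taken from core-site.xml.
        conf.set("fs.default.name", "hdfs://namenode:8020");
        FileSystem fs = FileSystem.get(conf);

        // Create a folder in HDFS (hadoop fs -mkdir equivalent).
        fs.mkdirs(new Path("/user/practise/input"));

        // Copy/put a local file into HDFS (hadoop fs -put equivalent).
        fs.copyFromLocalFile(new Path("/tmp/sample.txt"),
                             new Path("/user/practise/input/sample.txt"));

        // Get a file back to the local filesystem (hadoop fs -get equivalent).
        fs.copyToLocalFile(new Path("/user/practise/input/sample.txt"),
                           new Path("/tmp/sample-copy.txt"));

        fs.close();
    }
}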



Practise (continued)

Change block size (see the block-size sketch after this list)

  Through configuration
  At runtime, via command

Visualize the HDFS changes through the web interface
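A minimal sketch of both ways to change the block size, using the Hadoop 1.x dfs.block.size key (in bytes). Through configuration the key would normally live in hdfs-site.xml; at runtime it can be set per client or per file, roughly the API equivalent of hadoop fs -D dfs.block.size=134217728 -put .... The path and the sizes below are placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockSizeDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Through configuration: client-side default block size of 64 MB.
        // The same key would normally be set in hdfs-site.xml.
        conf.setLong("dfs.block.size", 64L * 1024 * 1024);

        FileSystem fs = FileSystem.get(conf);

        // At runtime: override the block size for this one file (128 MB here).
        Path file = new Path("/user/practise/bigfile.dat"); // placeholder path
        FSDataOutputStream out = fs.create(file,
                true,                                      // overwrite
                conf.getInt("io.file.buffer.size", 4096),  // buffer size
                (short) 3,                                 // replication factor
                128L * 1024 * 1024);                       // block size in bytes
        out.writeUTF("hello hdfs");
        out.close();
        fs.close();
    }
}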



Daemon Ports, Protocols & Parameters

Namenode
  Default port: 8020
  Configuration parameter: fs.default.name
  Protocol: IPC (ClientProtocol)
  Used for: filesystem metadata operations

Datanode
  Default port: 50010
  Configuration parameter: dfs.datanode.address
  Protocol: custom Hadoop Xceiver (DataNode and DFSClient)
  Used for: DFS data transfer

Datanode
  Default port: 50020
  Configuration parameter: dfs.datanode.ipc.address
  Protocol: IPC (InterDatanodeProtocol, ClientDatanodeProtocol, ClientProtocol)
  Used for: block metadata operations and recovery
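As a usage note for the first row: a client only needs fs.default.name to reach the NameNode's IPC port. A minimal sketch, assuming a NameNode reachable at the placeholder address namenode:8020:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListRoot {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // fs.default.name points clients at the NameNode IPC address (port 8020 by default).
        conf.set("fs.default.name", "hdfs://namenode:8020"); // placeholder hostname
        FileSystem fs = FileSystem.get(conf);
        // Listing the root is a filesystem metadata operation served by the NameNode.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}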



Daemon Ports, Protocols & Parameters (continued)

Backupnode
  Default port: 50100
  Configuration parameter: dfs.backup.address
  Protocol: same as namenode
  Used for: HDFS metadata operations

Jobtracker
  Default port: not well defined; common values are 8021, 9001, or 8012
  Configuration parameter: mapred.job.tracker
  Protocol: IPC (JobSubmissionProtocol, InterTrackerProtocol)
  Used for: job submission and task tracker heartbeats

Tasktracker
  Default port: 127.0.0.1:0 (binds to an unused local port)
  Configuration parameter: mapred.task.tracker.report.address
  Protocol: IPC (TaskUmbilicalProtocol)
  Used for: communicating with child jobs
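In the same spirit, mapred.job.tracker is what classic MapReduce clients use to find the JobTracker. A minimal sketch with the old mapred API, assuming the placeholder address jobtracker:8021 (one of the common port choices above):

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class JobTrackerPing {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf();
        // Placeholder JobTracker address; the default port is not well defined.
        conf.set("mapred.job.tracker", "jobtracker:8021");
        JobClient client = new JobClient(conf);
        // The cluster status is fetched from the JobTracker over IPC and
        // includes how many task trackers are currently alive.
        System.out.println("Active task trackers: "
                + client.getClusterStatus().getTaskTrackers());
        client.close();
    }
}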
Web UI Ports & Parameters

Daemon                    Web UI Port   Web UI Configuration Parameter

HDFS
  Namenode                50070         dfs.http.address
  Datanodes               50075         dfs.datanode.http.address
  Secondarynamenode       50090         dfs.secondary.http.address
  Backup/Checkpoint node  50105         dfs.backup.http.address

MR
  Jobtracker              50030         mapred.job.tracker.http.address
  Tasktrackers            50060         mapred.task.tracker.http.address
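To visualize the HDFS changes from the practise steps, these web UIs can be opened directly in a browser, for example http://<namenode-host>:50070 for the NameNode. A small sketch that prints where the UIs are configured to listen, falling back to the default ports from the table when nothing is set:

import org.apache.hadoop.conf.Configuration;

public class PrintWebUiAddresses {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Each get() falls back to the stock default when the parameter is not configured.
        System.out.println("Namenode UI:    http://" + conf.get("dfs.http.address", "0.0.0.0:50070"));
        System.out.println("Datanode UI:    http://" + conf.get("dfs.datanode.http.address", "0.0.0.0:50075"));
        System.out.println("Jobtracker UI:  http://" + conf.get("mapred.job.tracker.http.address", "0.0.0.0:50030"));
        System.out.println("Tasktracker UI: http://" + conf.get("mapred.task.tracker.http.address", "0.0.0.0:50060"));
    }
}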



Thank You

