
Big Data - ASSIGNMENT 3

The document provides instructions for performing common operations in HDFS: 1) It describes how to format and start HDFS using commands like hadoop namenode -format and start-dfs.sh. 2) It explains how to list files in HDFS using the hadoop fs -ls command and insert data into HDFS using commands like hadoop fs -mkdir, -put, and -ls. 3) Retrieving data from HDFS is demonstrated using commands like hadoop fs -cat to view files and -get to retrieve files locally. 4) Finally, shutting down HDFS is shown as stop-dfs.sh.

Uploaded by

DHARSHANA C P

Assignment - 3

Question 3: Show a practical example of listing files, inserting data, retrieving data, and
shutting down HDFS.

Initially, you have to format the configured HDFS file system. Open the namenode
(HDFS server) and execute the following command.

$ hadoop namenode -format


After formatting the HDFS, start the distributed file system. The following
command will start the namenode as well as the datanodes as a cluster.

$ start-dfs.sh
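To confirm that the daemons actually came up, a common habit is a quick check with jps, the process lister that ships with the JDK. This is a minimal sketch, not part of the assignment's required commands; NameNode, DataNode, and SecondaryNameNode are the standard HDFS daemon names.

```shell
# Sketch: a small helper that checks whether the HDFS daemons are running,
# using the jps tool from the JDK.
hdfs_daemons_up() {
    jps | grep -E 'NameNode|DataNode|SecondaryNameNode'
}

# Only run the check when jps is actually on the PATH:
if command -v jps >/dev/null 2>&1; then
    hdfs_daemons_up || echo "no HDFS daemons running"
fi
```

If the grep prints nothing, check the namenode and datanode logs under `$HADOOP_HOME/logs` before continuing.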

Listing Files in HDFS


After loading the information into the server, we can list the files in a directory,
or check the status of a single file, using the ls command. Given below is the syntax
of ls; you can pass a directory or a filename as an argument.

$ $HADOOP_HOME/bin/hadoop fs -ls <args>
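Since the same command accepts either a directory or a file, a small wrapper makes the two cases explicit. This is an illustrative sketch; the paths /user and /user/input/file.txt are assumptions, not paths the assignment requires.

```shell
# Sketch: list_path wraps 'hadoop fs -ls' so the same call works for a
# directory or a single file. Adjust the example paths to your cluster.
list_path() {
    hadoop fs -ls "$1"
}

# Only attempt real listings when the hadoop CLI is on the PATH:
if command -v hadoop >/dev/null 2>&1; then
    list_path /user || true                  # directory: one line per entry
    list_path /user/input/file.txt || true   # file: its own status line
fi
```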

Inserting Data into HDFS


Assume we have data in a file called file.txt on the local system that needs to
be saved in the HDFS file system. Follow the steps given below to insert the required
file into the Hadoop file system.

Step 1
You have to create an input directory.

$ $HADOOP_HOME/bin/hadoop fs -mkdir /user/input

Step 2
Transfer and store a data file from the local system to the Hadoop file system using
the put command.

$ $HADOOP_HOME/bin/hadoop fs -put /home/file.txt /user/input

Step 3
You can verify the file using the ls command.

$ $HADOOP_HOME/bin/hadoop fs -ls /user/input
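The three steps above can be sketched as one re-runnable script. /home/file.txt and /user/input are the example paths from the text; the -p flag on mkdir (no error if the directory already exists) and -f on put (overwrite an existing copy) are standard hadoop fs options that make repeated runs safe.

```shell
# Sketch: create the target directory, upload the file, then list it.
upload_to_hdfs() {
    local_file=$1
    hdfs_dir=$2
    hadoop fs -mkdir -p "$hdfs_dir"           # Step 1: create the directory
    hadoop fs -put -f "$local_file" "$hdfs_dir"  # Step 2: copy the file in
    hadoop fs -ls "$hdfs_dir"                 # Step 3: verify with ls
}

# Run only when the hadoop CLI is available:
command -v hadoop >/dev/null 2>&1 && upload_to_hdfs /home/file.txt /user/input || true
```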


Retrieving Data from HDFS
Assume we have a file in HDFS called outfile. Given below is a simple
demonstration of retrieving the required file from the Hadoop file system.

Step 1
Initially, view the data from HDFS using the cat command.

$ $HADOOP_HOME/bin/hadoop fs -cat /user/output/outfile

Step 2
Get the file from HDFS to the local file system using the get command.

$ $HADOOP_HOME/bin/hadoop fs -get /user/output/ /home/hadoop_tp/
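The two retrieval steps can likewise be sketched as a single helper. /user/output/outfile and /home/hadoop_tp/ are the example paths used above; adjust them to your setup.

```shell
# Sketch: print a file stored in HDFS, then copy it to the local file system.
fetch_from_hdfs() {
    src=$1    # HDFS path
    dest=$2   # local destination directory
    hadoop fs -cat "$src"          # Step 1: view the file's contents
    hadoop fs -get "$src" "$dest"  # Step 2: copy it out of HDFS
}

# Run only when the hadoop CLI is available:
command -v hadoop >/dev/null 2>&1 && fetch_from_hdfs /user/output/outfile /home/hadoop_tp/ || true
```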

Shutting Down the HDFS


You can shut down HDFS using the following command.

$ stop-dfs.sh
