Hadoop 2.4 Installation on Ubuntu 14.04
akshath@baabte.com
facebook.com/akshath.kumar180
twitter.com/akshath4u
in.linkedin.com/in/akshathkumar
HADOOP 2.4 INSTALLATION ON UBUNTU 14.04
Hadoop is a free, Java-based programming framework that supports the
processing of large data sets in a distributed computing environment. It is part
of the Apache project sponsored by the Apache Software Foundation.
In this presentation, we'll install a single-node Hadoop cluster backed by the Hadoop
Distributed File System on Ubuntu.
HADOOP ON 64-BIT UBUNTU 14.04
➢ INSTALLING JAVA
➢ ADDING A DEDICATED HADOOP USER
➢ INSTALLING SSH
➢ CREATE AND SETUP SSH CERTIFICATES
➢ INSTALL HADOOP
➢ SETUP CONFIGURATION FILES
➢ FORMAT THE NEW HADOOP FILESYSTEM
➢ STARTING HADOOP
➢ STOPPING HADOOP
➢ HADOOP WEB INTERFACES
➢ USING HADOOP
STEPS FOR INSTALLING HADOOP
❖ Hadoop framework is written in Java!
➢ Open a terminal in Ubuntu 14.04 and follow the steps below.
INSTALLING JAVA
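The installation commands appeared as a screenshot in the original slide; a typical sequence on Ubuntu 14.04, assuming OpenJDK 7 (the JVM referenced later in this deck), is:
$ sudo apt-get update
$ sudo apt-get install openjdk-7-jdk
$ java -version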
ADDING A DEDICATED HADOOP USER
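The commands for this step were also shown as an image; the usual approach, assuming the hduser account and hadoop group used throughout this deck, is:
$ sudo addgroup hadoop
$ sudo adduser --ingroup hadoop hduser
$ sudo adduser hduser sudo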
➔ SSH has two main components:
◆ ssh : The command we use to connect to remote machines - the client.
◆ sshd : The daemon that is running on the server and allows clients to connect to
the server.
➔ The ssh client is pre-installed on Linux, but in order to start the sshd daemon we need to install
the ssh package first. Use this command to do that.
$ sudo apt-get install ssh
INSTALLING SSH
➔ Hadoop requires SSH access to manage its nodes, so we need to configure SSH access to
localhost. SSH therefore has to be up and running on our machine and configured to allow
public-key authentication.
CREATE AND SETUP SSH CERTIFICATES
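➔ The key-generation commands were shown as a screenshot in the original slide; a typical pair, run as hduser, is:
$ ssh-keygen -t rsa -P ""
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys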
➔ The second command adds the newly created key to the list of authorized keys
so that Hadoop can use ssh without prompting for a password.
◆ We can check whether ssh works:
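A quick check, assuming the key setup above:
$ ssh localhost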
CREATE AND SETUP SSH CERTIFICATES
➔ Type the following commands in the Ubuntu terminal.
➔ We then move the Hadoop installation to the /usr/local/hadoop directory using
the following commands.
INSTALL HADOOP
$ wget http://mirrors.sonic.net/apache/hadoop/common/hadoop-2.4.1/hadoop-2.4.1.tar.gz
$ tar xvzf hadoop-2.4.1.tar.gz
$ sudo mv hadoop-2.4.1 /usr/local/hadoop
$ sudo chown -R hduser:hadoop /usr/local/hadoop
$ cd /usr/local/hadoop
$ pwd
/usr/local/hadoop
$ ls
bin etc include lib libexec LICENSE.txt NOTICE.txt README.txt sbin share
➔ The following files will have to be modified to complete the Hadoop setup:
1. ~/.bashrc
2. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
3. /usr/local/hadoop/etc/hadoop/core-site.xml
4. /usr/local/hadoop/etc/hadoop/mapred-site.xml.template
5. /usr/local/hadoop/etc/hadoop/hdfs-site.xml
➔ First of all, we need to find the path where Java has been installed so that we can set the
JAVA_HOME environment variable. Use the following command:
SETUP CONFIGURATION FILES
$ update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java):
/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
Nothing to configure.
1. ~/.bashrc
We can append the following to the end of ~/.bashrc:
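The exact lines were shown as an image in the original deck; a commonly used set, assuming Hadoop lives in /usr/local/hadoop and OpenJDK 7 is installed, is:
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
Reload the file afterwards with $ source ~/.bashrc so the variables take effect in the current shell.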
SETUP CONFIGURATION FILES
2. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
★ We need to set JAVA_HOME by modifying the hadoop-env.sh file:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
★ Adding the above statement to the hadoop-env.sh file ensures that the value of
the JAVA_HOME variable is available to Hadoop whenever it starts up.
SETUP CONFIGURATION FILES
3. /usr/local/hadoop/etc/hadoop/core-site.xml
★ The /usr/local/hadoop/etc/hadoop/core-site.xml file contains configuration
properties that Hadoop uses when starting up. It can be used to override the
default settings that Hadoop starts with. First, create a directory for Hadoop's
temporary files:
$ sudo mkdir -p /app/hadoop/tmp
$ sudo chown hduser:hadoop /app/hadoop/tmp
SETUP CONFIGURATION FILES
Open the file and enter the following in between the <configuration>
</configuration> tags.
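The property block was an image in the original slide; a typical single-node configuration, reusing the /app/hadoop/tmp directory created above (the fs.default.name URI and port 54310 here are tutorial conventions, not requirements), is:
<property>
<name>hadoop.tmp.dir</name>
<value>/app/hadoop/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The URI of the default file system.</description>
</property>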
SETUP CONFIGURATION FILES
4. /usr/local/hadoop/etc/hadoop/mapred-site.xml
➔ By default, the /usr/local/hadoop/etc/hadoop/ folder contains a
mapred-site.xml.template file, which has to be copied to (or renamed as)
mapred-site.xml:
$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template \
/usr/local/hadoop/etc/hadoop/mapred-site.xml
SETUP CONFIGURATION FILES
➔ The mapred-site.xml file is used to specify which framework is used for
MapReduce. We need to enter the following content in between the
<configuration></configuration> tags.
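That block was an image in the original slide; a minimal sketch, assuming MapReduce runs on YARN (which matches the start-yarn.sh step later in this deck), is:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>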
SETUP CONFIGURATION FILES
5. /usr/local/hadoop/etc/hadoop/hdfs-site.xml
➔ Before editing this file, we need to create two directories that will hold the namenode and
datanode data for this Hadoop installation. This can be done using the following commands.
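The commands were shown as a screenshot; based on the paths used in the hdfs-site.xml below, they would be:
$ sudo mkdir -p /usr/local/hadoop_store/hdfs/namenode
$ sudo mkdir -p /usr/local/hadoop_store/hdfs/datanode
$ sudo chown -R hduser:hadoop /usr/local/hadoop_store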
SETUP CONFIGURATION FILES
➔ Open the file and enter the following content in between the <configuration></configuration> tags.
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
<description>Default block replication.
The actual number of replications can be specified when the file is created.
The default is used if replication is not specified in create time.
</description>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/datanode</value>
</property>
</configuration>
SETUP CONFIGURATION FILES
➔ Now the Hadoop filesystem needs to be formatted so that we can start using it.
The format command should be run by a user with write permission, since it creates
a current directory under the /usr/local/hadoop_store/hdfs/namenode folder.
hduser@k:~$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
[... output truncated ...]
14/07/13 22:13:10 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at k/127.0.1.1
************************************************************/
FORMAT THE NEW HADOOP FILESYSTEM
The hadoop namenode -format command should be executed only once, before we start using Hadoop. If it is
executed again after Hadoop has been used, it will destroy all the data on the Hadoop file system.
➔ Now it's time to start the newly installed single-node cluster. We can use
start-all.sh or (start-dfs.sh and start-yarn.sh).
hduser@k:/home/k$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/07/13 23:36:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
[... output truncated ...]
localhost: starting nodemanager,
logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-k.out
➔ Check whether it's really up and running:
hduser@k:/home/k$ jps
➔ Another way to check is to use netstat:
hduser@k:/home/k$ netstat -plten | grep java
STARTING HADOOP
➔ Type the following commands into the terminal to stop Hadoop.
$ pwd
/usr/local/hadoop/sbin
$ ls
distribute-exclude.sh httpfs.sh start-all.sh
[... other scripts elided ...]
start-secure-dns.sh stop-balancer.sh stop-yarn.sh
➔ We run stop-all.sh or (stop-dfs.sh and stop-yarn.sh) to stop all the daemons
running on our machine:
$ /usr/local/hadoop/sbin/stop-all.sh
STOPPING HADOOP
Thank You...
US UK UAE
7002 Hana Road,
Edison NJ 08817,
United States of America.
90 High Street,
Cherry Hinton,
Cambridge, CB1 9HZ,
United Kingdom.
Suite No: 51, Oasis Center,
Sheikh Zayed Road, Dubai,
UAE
Email to info@baabtra.com or Visit baabtra.com
Looking to learn more about the above topic?
India Centres
Emarald Mall (Big Bazar Building)
Mavoor Road, Kozhikode,
Kerala, India.
Ph: + 91 – 495 40 25 550
NC Complex, Near Bus Stand
Mukkam, Kozhikode,
Kerala, India.
Ph: + 91 – 495 40 25 550
Cafit Square IT Park,
Hilite Business Park,
Kozhikode
Kerala, India.
Email: info@baabtra.com
TBI - NITC
NIT Campus, Kozhikode.
Kerala, India.
Start up Village
Eranakulam,
Kerala, India.
Start up Village
UL CC
Kozhikode, Kerala
Follow us @ twitter.com/baabtra
Like us @ facebook.com/baabtra
Subscribe to us @ youtube.com/baabtra
Become a follower @ slideshare.net/BaabtraMentoringPartner
Connect to us @ in.linkedin.com/in/baabtra
Give a feedback @ massbaab.com/baabtra
Thanks in advance
www.baabtra.com | www.massbaab.com |www.baabte.com
Want to learn more about programming, or looking to become a good programmer?
Are you wasting time on searching so many contents online?
Do you want to learn things quickly?
Tired of spending a huge amount of money to become a software professional?
Do an online course
@ baabtra.com
We put industry standards into practice. Our structured, activity-based courses are designed
to quickly turn anybody who holds a passion for coding into a good software professional.