A GUIDE TO HADOOP
INSTALLATION
BY WizIQ
Hadoop Installation
• Putty connectivity
• Java Installation (openjdk-7-jdk)
• Group/User Creation & SSH Certification
• Hadoop Install (Hadoop 2.2.0)
• Hadoop configuration
• Hadoop Services
Putty Connectivity
• User: ubuntu@<name>
• Make sure SSH > Auth is configured:
browse to the .ppk file location
Hadoop Installation
Java Installation
• ubuntu@ip-10-45-133-21:~$ sudo apt-get install openjdk-7-jdk
• Error: Err http://us-east-1.ec2.archive.ubuntu.com/ubuntu/ precise-updates/main liblvm2app2.2 amd64 2.02.66-4ubuntu7.1 403 Forbidden ……
• Solution: refresh the package lists first, then retry the install:
ubuntu@ip-10-45-133-21:~$ sudo apt-get update
ubuntu@ip-10-45-133-21:~$ sudo apt-get install openjdk-7-jdk
ubuntu@ip-10-45-133-21:~$ cd /usr/lib/jvm
• Error:
ubuntu@ip-10-45-133-21:/usr/lib/jvm$ ln -s java-7-openjdk-amd64 jdk
ln: failed to create symbolic link `jdk': Permission denied
• Solution: create the link as root (sudo ln -s would also work):
ubuntu@ip-10-45-133-21:/usr/lib/jvm$ sudo su
root@ip-10-45-133-21:/usr/lib/jvm# ln -s java-7-openjdk-amd64 jdk
root@ip-10-45-133-21:/usr/lib/jvm# apt-get install openssh-server
Group/User Creation & SSH Certification
• ubuntu@ip-10-45-133-21$ sudo addgroup
hadoop
• ubuntu@ip-10-45-133-21$ sudo adduser
--ingroup hadoop hduser
• ubuntu@ip-10-45-133-21$ sudo adduser
hduser sudo
**After the user is created, re-login into Ubuntu as hduser:
ubuntu@ip-10-45-133-21:~$ su -l hduser
Password:
Setup SSH Certificate
hduser@ip-10-45-133-21$ ssh-keygen -t rsa -P ''
...
Your identification has been saved in
/home/hduser/.ssh/id_rsa.
Your public key has been saved in
/home/hduser/.ssh/id_rsa.pub.
...
hduser@ip-10-45-133-21$ cat ~/.ssh/id_rsa.pub
>> ~/.ssh/authorized_keys
hduser@ip-10-45-133-21$ ssh localhost
**Note: I used an empty passphrase (-P '') for id_rsa here.
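The key-setup steps above can be scripted so they run without prompts. A minimal sketch, using the default per-user key paths; the existence guard (so a rerun does not overwrite an existing key) is my addition, not from the slides:

```shell
# Non-interactive version of the SSH certificate setup -- a sketch.
SSH_DIR="$HOME/.ssh"
mkdir -p "$SSH_DIR" && chmod 700 "$SSH_DIR"

# -P '' sets an empty passphrase, -q suppresses the banner; skip
# generation if a key already exists so the step is re-runnable.
[ -f "$SSH_DIR/id_rsa" ] || ssh-keygen -t rsa -P '' -f "$SSH_DIR/id_rsa" -q

# Authorize the key for password-less logins to localhost; sshd
# ignores authorized_keys unless permissions are strict.
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"
```

The chmod calls matter: a "Permission denied (publickey)" on `ssh localhost` is often caused by group- or world-writable `.ssh` or `authorized_keys`.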
Hadoop Install(Hadoop 2.2.0)
• ubuntu@ip-10-45-133-21:~$ su -l hduser
• hduser@ip-10-45-133-21$ cd ~
• hduser@ip-10-45-133-21$ wget http://www.trieuvan.com/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0.tar.gz
• hduser@ip-10-45-133-21$ sudo tar vxzf
hadoop-2.2.0.tar.gz -C /usr/local
• hduser@ip-10-45-133-21$ cd /usr/local
• hduser@ip-10-45-133-21$ sudo mv hadoop-
2.2.0 hadoop
• hduser@ip-10-45-133-21$ sudo chown -R
hduser:hadoop hadoop
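The extract-and-rename steps above can be condensed into one helper. A sketch only; `install_hadoop` is an illustrative name, and in practice extracting into /usr/local needs sudo as the slides show:

```shell
# Sketch of the unpack-and-rename steps as a reusable helper.
install_hadoop() {
  tarball="$1"   # e.g. hadoop-2.2.0.tar.gz
  prefix="$2"    # e.g. /usr/local (usually needs sudo)

  tar -xzf "$tarball" -C "$prefix"
  # Drop the version suffix so configs can refer to a stable
  # $prefix/hadoop path across upgrades.
  mv "$prefix/hadoop-2.2.0" "$prefix/hadoop"
}
```

Renaming `hadoop-2.2.0` to `hadoop` is what lets HADOOP_INSTALL stay `/usr/local/hadoop` in the environment setup that follows.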
Hadoop Configuration
• Setup Hadoop Environment Variables:
• hduser@ip-10-45-133-21$cd ~
• hduser@ip-10-45-133-21$vi .bashrc
Copy & Paste following to the end of the file:
#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/jdk/
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
###end of paste
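To confirm the variables took effect without re-logging in, they can be applied and checked in the current shell. A sketch that mirrors the .bashrc block above (same values), not a replacement for it:

```shell
# Apply the Hadoop variables in the current shell and sanity-check them.
export HADOOP_INSTALL=/usr/local/hadoop
export PATH="$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin"
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL

# Both bin/ and sbin/ of the install should now appear on PATH.
echo "$PATH" | tr ':' '\n' | grep -c "$HADOOP_INSTALL"
```

If the count is less than 2, the PATH lines were not pasted correctly and commands like start-dfs.sh (which lives in sbin/) will not be found.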
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi hadoop-env.sh
#modify JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/jdk/
****Re-login into Ubuntu as hduser and check the Hadoop version******
hduser@ip-10-45-133-21$ hadoop version
Hadoop 2.2.0
Subversion https://svn.apache.org/repos/asf/hadoop/common -r 1529768
Compiled by hortonmu on 2013-10-
07T06:28Z
Compiled with protoc 2.5.0
From source with checksum
79e53ce7994d1628b240f09af91e1af4
This command was run using
/usr/local/hadoop-
2.2.0/share/hadoop/common/hadoop-
common-2.2.0.jar
At this point, Hadoop is installed.
Configure Hadoop :
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi core-site.xml
#Paste following between <configuration>
<property>
<name>fs.default.name</name> <!-- deprecated alias of fs.defaultFS in Hadoop 2.x -->
<value>hdfs://localhost:9000</value>
</property>
hduser@ip-10-45-133-21$ vi yarn-site.xml
#Paste following between <configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
hduser@ip-10-45-133-21$ mv mapred-site.xml.template mapred-site.xml
hduser@ip-10-45-133-21$ vi mapred-site.xml
• #Paste following between <configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
hduser@ip-10-45-133-21$ cd ~
hduser@ip-10-45-133-21$ mkdir -p mydata/hdfs/namenode
hduser@ip-10-45-133-21$ mkdir -p mydata/hdfs/datanode
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi hdfs-site.xml
• #Paste following between <configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/home/hduser/mydata/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/home/hduser/mydata/hdfs/datanode</value>
</property>
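The directories named in dfs.namenode.name.dir and dfs.datanode.data.dir must exist and be writable by hduser before the namenode is formatted. A sketch that prepares and verifies them; DATA_ROOT defaults to ~/mydata as in the slides:

```shell
# Create the local storage directories referenced by hdfs-site.xml
# and confirm they are writable before formatting the namenode.
DATA_ROOT="${DATA_ROOT:-$HOME/mydata}"
mkdir -p "$DATA_ROOT/hdfs/namenode" "$DATA_ROOT/hdfs/datanode"

for d in "$DATA_ROOT/hdfs/namenode" "$DATA_ROOT/hdfs/datanode"; do
  [ -w "$d" ] || echo "WARNING: $d is not writable"
done
```

A missing or unwritable datanode directory is a common reason the datanode silently fails to join after `start-dfs.sh`.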
Format Namenode :
hduser@ip-10-45-133-21$ hdfs namenode -format
Hadoop Services
hduser@ip-10-45-133-21$ start-dfs.sh
....
hduser@ip-10-45-133-21$ start-yarn.sh
....
Errors seen on first start:
• Error 1: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable ….
• Error 2: localhost: Permission denied (publickey)…
Rechecked the following:
1) SSH certificate creation
2) ssh localhost
3) vi hadoop-env.sh
4) vi hdfs-site.xml
5) Namenode and datanode directories
6) Reformatted the namenode
7) Restarted the services
After that it started working, as the screens on the next slides show; it is not clear which of these steps fixed the issue.
Hadoop Installation
• hduser@ip-10-45-133-21:~$ jps
• All services came up successfully and were visible in the jps output.
• Reconnected and everything works fine.
THANK YOU!
Enroll now for Hadoop and Big Data Training @ WizIQ.com
Visit: http://www.wiziq.com/course/21308-hadoop-big-data-training
For more information, feel free to contact us at courses@wiziq.com