
Install and Configure Apache Hbase on Ubuntu 16.04

14 Aug 2017 · CPOL · 1 min read
This article details the steps to install and configure Apache HBase on Linux (Ubuntu 16.04).

Install Prerequisites

  1. Install hadoop (Instructions in this blog)
  2. Install zookeeper (Use this blog)
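
HBase depends on both of these being up before it starts. A quick way to confirm (a minimal check, assuming a standard single-node Hadoop and ZooKeeper setup as described in the linked blogs):

jps
# Expect the Hadoop daemons (NameNode, DataNode, ResourceManager, NodeManager)
# and the ZooKeeper process (QuorumPeerMain) in the output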

Add Unix User and Set Environment

I keep all my Hadoop environment variables in a single file, "hadoopenv", which is sourced by all the accounts.

My hadoopenv file (note that this file is not in hbase user home):

kamal@kamal-Lenovo-G505:~$ cat hadoopenv
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export HADOOP_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
export YARN_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_INSTALL/lib/native"
export JAVA_LIBRARY_PATH=$HADOOP_INSTALL/lib/native
export LD_LIBRARY_PATH=$HADOOP_INSTALL/lib/native:$LD_LIBRARY_PATH
export YARN_EXAMPLES=$HADOOP_INSTALL/share/hadoop/mapreduce
export HADOOP_MAPRED_STOP_TIMEOUT=30
export YARN_STOP_TIMEOUT=30
#HADOOP VARIABLES END

#PIG VARIABLES
export PIG_HOME=/usr/local/pig
export PATH=$PATH:$PIG_HOME/bin
export PIG_CLASSPATH=$PIG_HOME/conf:$HADOOP_INSTALL/etc/hadoop
#PIG VARIABLES END
#HBASE_VARIABLES
export HBASE_HOME=/usr/local/hbase
export PATH=$PATH:$HBASE_HOME/bin
#HIVE_VARIABLES 
export HIVE_HOME=/usr/local/hive 
export PATH=$PATH:$HIVE_HOME/bin

Create the hbase user:

sudo adduser --ingroup hadoop hbase

Edit the .bashrc file of the hbase user to source the hadoopenv file (update the path as applicable to your system):

. /home/kamal/hadoopenv
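
For example, the source line can be appended to the hbase user's .bashrc non-interactively (a minimal sketch; it assumes the hadoopenv file at /home/kamal/hadoopenv is readable by the hbase user):

sudo su - hbase
echo '. /home/kamal/hadoopenv' >> ~/.bashrc
source ~/.bashrc
echo $HBASE_HOME        # should print /usr/local/hbase
exit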

Download and Extract hbase Binary

Update the URL below as per your mirror from the Apache HBase website.
cd Downloads
wget http://www-us.apache.org/dist/hbase/stable/hbase-1.2.6-bin.tar.gz
tar xvf hbase-1.2.6-bin.tar.gz
sudo mv hbase-1.2.6 /usr/local/hbase
cd /usr/local
sudo chown -R hbase:hadoop hbase 
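
As an optional sanity check, confirm the files landed where expected and that the hbase user owns them:

ls -ld /usr/local/hbase            # owner should be hbase:hadoop
ls /usr/local/hbase/bin | head     # should list the hbase launcher scripts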

Setup SSH

sudo su - hbase
ssh-keygen -t rsa -P ""
cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
ssh localhost 
exit
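
To confirm that passwordless SSH works for the hbase user, retry the login in batch mode, which fails instead of prompting for a password if key authentication is not set up:

sudo su - hbase -c "ssh -o BatchMode=yes localhost 'echo passwordless SSH OK'"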

Configure Hbase

Configure hbase-env.sh Present in $HBASE_HOME/conf

Login as hbase user and edit these files.

sudo su - hbase
vi $HBASE_HOME/conf/hbase-env.sh

Update the three parameters listed below in the hbase-env.sh file.

The last parameter tells HBase that we run our own ZooKeeper setup; without this setting, HBase will try to start its own ZooKeeper instance.

HBASE_CLASSPATH needs to point to the Hadoop configuration directory so that HBase can use our HDFS installation.

# The java implementation to use.  Java 1.7+ required.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
# Extra Java CLASSPATH elements.  Optional.
export HBASE_CLASSPATH=/usr/local/hadoop/etc/hadoop
# Tell HBase whether it should manage its own instance of Zookeeper or not.
export HBASE_MANAGES_ZK=false
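
Because HBASE_MANAGES_ZK=false, ZooKeeper must already be running when HBase starts. A quick way to check, assuming the ZooKeeper ports used in hbase-site.xml below and that netcat is installed:

echo ruok | nc localhost 2181    # ZooKeeper answers "imok" if it is healthy
echo ruok | nc localhost 2182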

Configure hbase-site.xml File Present in $HBASE_HOME/conf

The last 5 properties in the file enable simple authentication and authorization in HBase. You will see the impact of this later in this blog.

<configuration>
<property>
  <name>hbase.cluster.distributed</name>
  <value>true</value>
</property>
<property>
  <name>hbase.rootdir</name>
  <value>hdfs://localhost:9000/user/hbase</value>
</property>
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>localhost:2181,localhost:2182</value>
</property>
<property>
  <name>hbase.security.authentication</name>
  <value>simple</value>
</property>
<property>
  <name>hbase.security.authorization</name>
  <value>true</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
<property>
  <name>hbase.coprocessor.region.classes</name>
  <value>org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
<property>
  <name>hbase.coprocessor.regionserver.classes</name>
  <value>org.apache.hadoop.hbase.security.access.AccessController</value>
</property>
</configuration>
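
Before starting HBase, it is worth confirming that the NameNode address in hbase.rootdir matches the running HDFS instance (it must be the same as fs.defaultFS from the Hadoop setup). A quick check, assuming the Hadoop install from the prerequisites:

hdfs getconf -confKey fs.defaultFS     # expect hdfs://localhost:9000
hdfs dfs -ls /                         # HDFS must be reachable before HBase starts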

Start hbase

sudo su - hbase
start-hbase.sh
hbase-daemon.sh start thrift
exit 
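
If everything started cleanly, the HBase daemons will appear alongside the Hadoop and ZooKeeper processes:

jps
# In addition to the Hadoop and ZooKeeper processes, expect:
# HMaster, HRegionServer and ThriftServer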

Verify Installation

Let's log in to the HBase shell and create a table.

hbase shell
hbase(main):004:0> create 'test2', 'cf1'
ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions 
(user=kamal, scope=default, params=[namespace=default,table=default:test2,family=cf1],action=CREATE)
Here is some help for this command

So this did not work, but that is because we have enabled authorization and authentication on HBase. We first need to log on as the HBase admin user and grant permissions to my user "kamal".

So we first log on as 'hbase', which is our admin user for HBase. You can see that we are able to create a table as admin, so the installation is working.
Next, we assign the 'RWC' (Read, Write, Create) grant to "kamal".

sudo su - hbase
hbase shell
...
...
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.6, rUnknown, Mon May 29 02:25:32 CDT 2017
hbase(main):001:0> create 'test2','cf'
0 row(s) in 5.2730 seconds
=> Hbase::Table - test2
hbase(main):002:0> grant 'kamal','RWC'
0 row(s) in 0.7230 seconds
hbase(main):003:0> quit
exit
logout

Now let's see if it works:

kamal@kamal-Lenovo-G505:/usr/local/hbase/conf$ hbase shell
...
...
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.6, rUnknown, Mon May 29 02:25:32 CDT 2017
hbase(main):001:0> create 'test3','cf'
0 row(s) in 2.8460 seconds
=> Hbase::Table - test3
hbase(main):002:0> 

So there you have it, a working hbase installation.
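
As a further check, you can write and read back a row (a minimal sketch using the 'test3' table and 'cf' column family created above):

hbase shell <<'EOF'
put 'test3', 'row1', 'cf:greeting', 'hello hbase'
scan 'test3'
EOF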

Create start-stop script (run_hbase.sh)

#!/bin/bash
# Start/stop HBase and its Thrift server as the hbase user
. /home/kamal/hadoopenv

case $1 in
start)
    su -p - hbase -c "$HBASE_HOME/bin/start-hbase.sh"
    su -p - hbase -c "$HBASE_HOME/bin/hbase-daemon.sh start thrift"
    ;;

stop)
    su -p - hbase -c "$HBASE_HOME/bin/stop-hbase.sh"
    su -p - hbase -c "$HBASE_HOME/bin/hbase-daemon.sh stop thrift"
    ;;
esac

The script can be used as follows:

sudo ./run_hbase.sh start
sudo ./run_hbase.sh stop 
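
Remember to make the script executable before the first run:

chmod +x run_hbase.sh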

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)