Failed to create a table in Hive backed by HBase?

I failed to create a table in Hive backed by HBase. Will copying hbase-site.xml into /etc/hive/conf/ fix the error?

Often, when you use Impala with HBase, you first need to create the table in the Hive metastore via the Hive CLI:

$ hive --auxpath /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/zookeeper.jar,/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/hbase.jar,/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.5.0.jar,/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/guava-11.0.2.jar

hive > CREATE EXTERNAL TABLE
dsp_test(…)
STORED BY
'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
"hbase.columns.mapping" = "…"
)
TBLPROPERTIES (
"hbase.table.name" = "dsp_ratestradesseod"
);
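
Before mapping it, you can double-check from the HBase shell that the underlying HBase table really exists (standard exists / describe commands; the table name is the one used in TBLPROPERTIES above):

$ hbase shell
hbase(main):001:0> exists 'dsp_ratestradesseod'
hbase(main):002:0> describe 'dsp_ratestradesseod'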

Here the HBase table "dsp_ratestradesseod" does already exist, yet you might get the following failure:

FAILED: Error in metadata: MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException: Retried 10 times
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:138)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:73)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:147)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:428)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at $Proxy9.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:576)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3719)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:254)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:66)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1383)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1169)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:982)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
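
The MasterNotRunningException here is usually misleading: the HBase master itself is normally running fine, but the Hive CLI has no HBase/ZooKeeper client configuration on its classpath, so the HBase client falls back to localhost and gives up after its retries. A quick sanity check from the HBase shell (standard status command) shows whether the cluster itself is actually healthy:

$ hbase shell
hbase(main):001:0> status
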
Solution:
————

To create and use a Hive-HBase integrated table, the following are required (a sample client configuration sketch follows this list):
– dependency jar files: zookeeper.jar, hbase.jar, hive-hbase-handler.jar, guava-11.0.2.jar
– the ZooKeeper quorum
– security settings: the HBase master principal, region server principal, etc.
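
If Cloudera Manager is not managing the Hive client configuration, the same settings can be kept permanently in the client's hbase-site.xml under /etc/hive/conf/ (copying the cluster's hbase-site.xml there, as asked in the question, should achieve the same effect). A minimal sketch, reusing the property values from the command further below; the quorum hosts and Kerberos realm are site-specific:

<property>
  <name>hbase.zookeeper.quorum</name>
  <value>bdgtmaster01i1d.apac.dbversity.com,bdgtmaster02i1d.apac.dbversity.com,bdgtmaster03h1d.apac.dbversity.com</value>
</property>
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.rpc.engine</name>
  <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@NAMUXDEV.DYN.DBVERSITY.COM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@NAMUXDEV.DYN.DBVERSITY.COM</value>
</property>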

These parameters are configured through Cloudera Manager; once that configuration is complete, --auxpath and -hiveconf are no longer required to invoke the Hive CLI.
As a temporary solution, you can use the following command on Fajita:

hive --auxpath /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/zookeeper.jar,\
/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/hbase.jar,\
/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.5.0.jar,\
/opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p251.30/lib/hive/lib/guava-11.0.2.jar \
-hiveconf hbase.security.authentication=kerberos \
-hiveconf hbase.rpc.engine=org.apache.hadoop.hbase.ipc.SecureRpcEngine \
-hiveconf hbase.master.kerberos.principal=hbase/_HOST@NAMUXDEV.DYN.DBVERSITY.COM \
-hiveconf hbase.regionserver.kerberos.principal=hbase/_HOST@NAMUXDEV.DYN.DBVERSITY.COM \
-hiveconf hbase.zookeeper.quorum=bdgtmaster01i1d.apac.dbversity.com,bdgtmaster02i1d.apac.dbversity.com,bdgtmaster03h1d.apac.dbversity.com
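
Once Hive is launched this way, re-run the CREATE EXTERNAL TABLE statement from above. If the table is then queried through Impala (the use case mentioned at the start), Impala has to be told about the new metastore entry; a typical follow-up, using the example table name, would be:

$ impala-shell
> INVALIDATE METADATA;
> SELECT * FROM dsp_test LIMIT 5;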
