Hive hive.aux.jars.path/HIVE_AUX_JARS_PATH setting

Hi,
Is there a way to provide an aux jars path when setting up a Hive connection, e.g. by adding hive.aux.jars.path or HIVE_AUX_JARS_PATH?

@smora

Yes, you should be able to by adding the below to hive-site.xml:

<property>
  <name>hive.aux.jars.path</name>
  <value>/etc/hadoop-2.8.3/share/hadoop/tools/lib,/etc/apache-hive-1.2.2-bin/lib</value>
</property>
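If helpful, here is a quick shell check (a sketch; the paths are the example values from the snippet above) that confirms each comma-separated entry in hive.aux.jars.path actually exists on the node:

```shell
# Split hive.aux.jars.path on commas and verify each entry exists.
# Paths below are the example values from the hive-site.xml snippet above.
AUX_PATHS="/etc/hadoop-2.8.3/share/hadoop/tools/lib,/etc/apache-hive-1.2.2-bin/lib"
old_ifs=$IFS
IFS=','
for p in $AUX_PATHS; do
  if [ -e "$p" ]; then
    echo "ok:      $p"
  else
    echo "MISSING: $p"
  fi
done
IFS=$old_ifs
```

A MISSING entry means Hive (or Dremio) will silently fail to load any jars from that location.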

What is it you are trying to add?

@balaji.ramaswamy

Thanks Balaji. I am trying to access a Hive JDBC external table from Dremio.
Dremio runs outside the Hadoop/Hive cluster, and the JDBC jar I am using is not available in Hive's default config dir.
With the Hive client I can add the jar using the --auxpath option. To add the jar from Dremio, I need to copy it to Dremio's local storage and set hive.aux.jars.path; this seems to work.

I have another issue now: the Hive configuration succeeds using the HDFS Kerberos principal, but when I try to access any table I get the following error. Any suggestion on this?

java.io.IOException: Can’t get Master Kerberos principal for use as renewer


Hi @smora
How did you solve the first problem?
Did you export HIVE_AUX_JARS_PATH=/opt/dremio/hive, where hive is just a dir you made to store the jars in?
Also, should hive-site.xml be added in /opt/dremio/conf?

Thanks in advance,
Rosario

@0iras0r

If the custom jar cannot be added to Hive's default jars path, you can follow this approach:
copy the jar onto the Dremio master/executor nodes, and when setting up the Hive connection add the hive.aux.jars.path property pointing to your jar,
e.g.: hive.aux.jars.path=/opt/dremio/jars/hive-jdbc-handler-2.3.3.jar,/opt/dremio/jars/dremio-jdbc-driver-4.1.0-201912030136020081-49feeb75.jar

Yes, hive-site.xml needs to be added to the Dremio conf dir.
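Putting the pieces together, a sketch of the layout this approach needs (jar names and paths are the examples from this thread; a scratch dir is used here so the commands are safe to try anywhere, whereas on a real node you would target /opt/dremio directly):

```shell
# Stage the two pieces Dremio needs: the custom jar(s) and a hive-site.xml
# that points at them. Using a temp dir as a stand-in for /opt/dremio.
DREMIO_HOME=$(mktemp -d)
mkdir -p "$DREMIO_HOME/jars" "$DREMIO_HOME/conf"

# 1. the custom jar(s) must be present on every coordinator/executor node
touch "$DREMIO_HOME/jars/hive-jdbc-handler-2.3.3.jar"   # stand-in for the real jar

# 2. hive-site.xml goes into the Dremio conf dir and points at the jar(s)
cat > "$DREMIO_HOME/conf/hive-site.xml" <<'EOF'
<configuration>
  <property>
    <name>hive.aux.jars.path</name>
    <value>/opt/dremio/jars/hive-jdbc-handler-2.3.3.jar</value>
  </property>
</configuration>
EOF

ls "$DREMIO_HOME/conf"   # -> hive-site.xml
```

Remember that the jar paths in the `<value>` must be valid on the Dremio nodes themselves, not on the Hive cluster.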


I will try that.
Just to avoid misunderstanding: are the jars you listed in your answers specific to your case, or do they need to be added in every situation where you have an external table in Hive? I have Hive with Phoenix as the storage handler for external tables.
Also, what is your situation exactly? I'd like to help you with your Kerberos problems; I usually have no problems in a kerberized Hadoop system.

Thanks @smora !

Thank you very much for your previous hint! =)

@0iras0r
That's correct, the jars I mentioned are specific to my environment; in your case they will be your Phoenix-related ones.

The issue I have with Hive is: my connection is set up and I can see the databases and tables, but when I try to access a table under a database I get the below error in the Dremio UI

java.io.IOException: Can’t get Master Kerberos principal for use as renewer

@smora

Kerberos messages are very high-level and cryptic. There could be a variety of issues, ranging from an expired ticket to invalid permissions. One way is to check permissions: do a klist and see if the ticket has not expired, etc. If the basic things look good, enable Kerberos debugging and see if that helps.
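A quick way to run those basic checks (a sketch; the keytab path and principal are placeholders, and the snippet is guarded with command -v so it is harmless on hosts without the krb5 client tools):

```shell
# First-pass Kerberos sanity check on the Dremio coordinator.
if command -v klist >/dev/null 2>&1; then
  # klist succeeds only if there is a valid, unexpired ticket cache
  klist || echo "no valid ticket cache - renew it, e.g.: kinit -kt /etc/dremio/dremio.keytab <your-principal>"
else
  echo "krb5 client tools not installed on this host"
fi
```

`klist -kt <keytab>` is also worth running to confirm the keytab holds the principal you expect.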

Here is an article that might help

https://docs.dremio.com/knowledge-base/kerberos-trouble.html

To turn on debugging for Kerberos through dremio-env (this does not take effect on executors in a Yarn deployment), set the below and restart the coordinator:

DREMIO_JAVA_EXTRA_OPTS="-Dsun.security.krb5.debug=true -Dsun.security.spnego.debug=true -Djavax.net.debug=all"

To enable Kerberos debugging for executors in a Yarn deployment:

Open Dremio UI > Admin > Provisioning > Edit (use the pencil icon)

Note: provision only one container until the problem is solved, to reduce container startup and shutdown time

Under Additional properties, click "Add option" and enter the three below:

Type: java, Name: sun.security.krb5.debug, Value: true
Type: java, Name: sun.security.spnego.debug, Value: true
Type: java, Name: javax.net.debug, Value: all

Save and restart the Yarn containers

Thanks @balaji.ramaswamy, it's working after adding the mapred-site.xml and yarn-site.xml files.
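For reference, the "Can't get Master Kerberos principal for use as renewer" error above typically means the client cannot find the YARN Resource Manager's Kerberos principal, which lives in yarn-site.xml; copying the cluster's yarn-site.xml and mapred-site.xml into the Dremio conf dir supplies it. A minimal fragment of the relevant property (the principal value is an example; use the one from your cluster's yarn-site.xml):

```xml
<!-- yarn-site.xml fragment (sketch) -->
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>yarn/_HOST@EXAMPLE.COM</value>
</property>
```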

I still receive errors… @smora, @balaji.ramaswamy
Error in loading storage handler.org.apache.phoenix.hive.PhoenixStorageHandler

I tried to add hive.aux.jars.path to hive-site.xml like this:

<property>
  <name>hive.aux.jars.path</name>
  <value>/opt/dremio/auxPath/hive-jdbc-3.1.0.3.1.4.0-315-standalone.jar,/opt/dremio/auxPath/hive-jdbc-3.1.0.3.1.4.0-315.jar,/opt/dremio/auxPath/hive-jdbc-handler-3.1.0.3.1.4.0-315.jar,/opt/dremio/auxPath/phoenix-5.0.0.3.1.4.0-315-hive.jar</value>
</property>
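One sanity check worth running (a sketch; the jar path is the example from the list above): since class files are plain zip entries, `unzip -l` can confirm whether the storage-handler class is actually inside the jar you are pointing at.

```shell
# Is the Phoenix storage-handler class really inside the jar on this node?
JAR=/opt/dremio/auxPath/phoenix-5.0.0.3.1.4.0-315-hive.jar
if unzip -l "$JAR" 2>/dev/null | grep -q 'org/apache/phoenix/hive/PhoenixStorageHandler.class'; then
  echo "handler class found in $JAR"
else
  echo "handler class NOT found in $JAR"
fi
```

If the class is not in any listed jar, the ClassNotFoundException below is expected regardless of the hive-site.xml configuration.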

The thing is, I don't know how to check whether Dremio picked up the configuration from hive-site.xml, or whether I need to do something else.
In DBeaver, for example, I was able to get data directly with only the hive-jdbc connector.
Why do I get this problem even though I added the correct jars?
Am I missing something?
I think it's not a matter of Phoenix but of how to properly set up Hive for external tables with a different storage handler.

Here more logs:
INFO c.d.e.s.h.metadata.HiveMetadataUtils - User Error Occurred [ErrorId: d574eeca-75fe-42c9-b19d-37372a0442ed]
com.dremio.common.exceptions.UserException: Error in loading storage handler.org.apache.phoenix.hive.PhoenixStorageHandler
at com.dremio.common.exceptions.UserException$Builder.build(UserException.java:776) ~[dremio-common-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.store.hive.metadata.HiveMetadataUtils.getInputFormatClass(HiveMetadataUtils.java:1168) [dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.store.hive.metadata.HiveMetadataUtils.getInputFormat(HiveMetadataUtils.java:166) [dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.store.hive.metadata.HiveMetadataUtils.getTableMetadata(HiveMetadataUtils.java:357) [dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.store.hive.Hive3StoragePlugin.listPartitionChunks(Hive3StoragePlugin.java:559) [dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.DatasetSaver.save(DatasetSaver.java:96) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.DatasetSaver.save(DatasetSaver.java:154) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.DatasetManager.getTableFromPlugin(DatasetManager.java:349) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.DatasetManager.getTable(DatasetManager.java:209) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.CatalogImpl.getTable(CatalogImpl.java:130) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.SourceAccessChecker.lambda$getTable$3(SourceAccessChecker.java:103) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.SourceAccessChecker.checkAndGetTable(SourceAccessChecker.java:82) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.SourceAccessChecker.getTable(SourceAccessChecker.java:103) [dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.DelegatingCatalog.getTable(DelegatingCatalog.java:66) ~[dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.catalog.CachingCatalog.getTable(CachingCatalog.java:93) ~[dremio-sabot-kernel-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_232]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_232]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_232]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_232]
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.invoke(ReflectionHelper.java:1268) ~[hk2-utils-2.6.1.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorImpl.internalInvoke(MethodInterceptorImpl.java:85) ~[hk2-locator-2.6.1.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorImpl.invoke(MethodInterceptorImpl.java:101) ~[hk2-locator-2.6.1.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorInvocationHandler.invoke(MethodInterceptorInvocationHandler.java:39) ~[hk2-locator-2.6.1.jar:na]
at com.sun.proxy.$Proxy116.getTable(Unknown Source) ~[na:na]
at com.dremio.dac.explore.DatasetsResource.getDatasetSummary(DatasetsResource.java:258) ~[dremio-dac-backend-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.dac.explore.DatasetsResource.newUntitled(DatasetsResource.java:134) ~[dremio-dac-backend-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.dac.explore.DatasetsResource.newUntitledFromParent(DatasetsResource.java:196) ~[dremio-dac-backend-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_232]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_232]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_232]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_232]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:469) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:391) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:80) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:253) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:244) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) ~[jersey-common-2.29.1.jar:na]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) ~[jersey-server-2.29.1.jar:na]
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:392) ~[jersey-container-servlet-core-2.29.1.jar:na]
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) ~[jersey-container-servlet-core-2.29.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:365) ~[jersey-container-servlet-core-2.29.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:318) ~[jersey-container-servlet-core-2.29.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) ~[jersey-container-servlet-core-2.29.1.jar:na]
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:760) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1617) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at com.dremio.dac.server.SecurityHeadersFilter.doFilter(SecurityHeadersFilter.java:52) ~[dremio-dac-backend-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:50) ~[jetty-servlets-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1296) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1211) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.Server.handle(Server.java:500) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:386) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:560) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:378) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:914) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_232]
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error in loading storage handler.org.apache.phoenix.hive.PhoenixStorageHandler
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:308) ~[dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
at com.dremio.exec.store.hive.metadata.HiveMetadataUtils.getInputFormatClass(HiveMetadataUtils.java:1164) [dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
… 83 common frames omitted
Caused by: java.lang.ClassNotFoundException: org.apache.phoenix.hive.PhoenixStorageHandler
at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[na:1.8.0_232]
at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_232]
at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_232]
at org.pf4j.PluginClassLoader.loadClass(PluginClassLoader.java:126) ~[pf4j-3.0.1.jar:3.0.1]
at java.lang.Class.forName0(Native Method) ~[na:1.8.0_232]
at java.lang.Class.forName(Class.java:348) ~[na:1.8.0_232]
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:303) ~[dremio-ce-hive3-plugin-4.1.3-202001022113020736-53142377.jar:4.1.3-202001022113020736-53142377]
… 84 common frames omitted

Thanks in advance
Rosario

@0iras0r

Here is what I observed when accessing Hive JDBC tables via Dremio.

Without the JDBC jar in hive.aux.jars.path, you will get

cannot find jar error …

With the JDBC jar available locally to Dremio, or in the Hive default location, you may still get an error like the below

Error in loading storage handler.org.apache.hive.storage.jdbc.JdbcStorageHandler

As per the Dremio documentation, it seems the Dremio Hive connection works only for Hive-local tables, i.e. tables accessed directly as HDFS files via Hive. In my case, after adding all the jars, I get an error like the below

cannot convert <your-storage-handler> to hive-storage handler

So it works for Hive external tables backed by files sitting on HDFS storage, but not for JDBC-type tables; maybe @balaji.ramaswamy can comment more on it.

@0iras0r

Currently we do not support the Phoenix storage handler plugin

Thanks
@balaji.ramaswamy