Hi Dremio team,
I am able to configure and connect to Hive, but when trying to access a table I get “Access denied reading dataset hive.xxx”.
We are using MapR 6.x.
From server.log:
Caused by: java.lang.RuntimeException: Unable to connect to Hive metastore.
at com.dremio.exec.store.hive.HiveStoragePlugin.hasAccessPermission(HiveStoragePlugin.java:200) ~[na:na]
at com.dremio.exec.catalog.PermissionCheckCache.lambda$hasAccess$0(PermissionCheckCache.java:97) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4904) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache.get(LocalCache.java:4053) ~[guava-20.0.jar:na]
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899) ~[guava-20.0.jar:na]
at com.dremio.exec.catalog.PermissionCheckCache.getFromPermissionsCache(PermissionCheckCache.java:143) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.PermissionCheckCache.hasAccess(PermissionCheckCache.java:107) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
… 79 common frames omitted
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: java.lang.RuntimeException: Error getting metastore password: null
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:51543) ~[na:na]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:51520) ~[na:na]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:51451) ~[na:na]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) ~[na:na]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1466) ~[na:na]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1452) ~[na:na]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1332) ~[hive-metastore-2.1.1-mapr-1803-201908200826380359-c333a66.jar:2.1.1-mapr-1803-201908200826380359-c333a66]
at com.dremio.exec.store.hive.HiveClient$5.run(HiveClient.java:221) ~[na:na]
at com.dremio.exec.store.hive.HiveClient$5.run(HiveClient.java:217) ~[na:na]
at com.dremio.exec.store.hive.HiveClient.doCommand(HiveClient.java:337) ~[na:na]
at com.dremio.exec.store.hive.HiveClient.getTableWithoutTableTypeChecking(HiveClient.java:217) ~[na:na]
at com.dremio.exec.store.hive.HiveClient.getTable(HiveClient.java:230) ~[na:na]
at com.dremio.exec.store.hive.HiveClientWithAuthz.getTable(HiveClientWithAuthz.java:131) ~[na:na]
at com.dremio.exec.store.hive.HiveStoragePlugin.hasAccessPermission(HiveStoragePlugin.java:183) ~[na:na]
… 89 common frames omitted
@ronniearangali
Does the Dremio service user have access to the files in MapR?
yes.
[root@uat-r7-bdmapr601 ~]# cat /etc/sysconfig/dremio | grep USER
DREMIO_USER=mapr
[root@uat-r7-bdmapr601 ~]#
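For completeness, one quick way to confirm that this service user can actually read a table directory is to run the listing as that user (the path below is only an example, not our real warehouse location):
# example path only – substitute the actual Hive warehouse/table directory
sudo -u mapr hadoop fs -ls /path/to/hive/warehouse/mydb.db/mytable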
@ronniearangali
Have you tried restarting your coordinator? I am seeing some permission cache errors; I just want to see whether a restart fixes it, and if it does, we will work on it internally here to address it.
Thanks
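For reference, restarting only the coordinator service would look roughly like this on an RPM-based install (a sketch only; adjust to however your Dremio service is managed):
# assumes the standard Dremio RPM service script
sudo service dremio stop
sudo service dremio start
# then watch whether the permission cache errors come back
tail -f /var/log/dremio/server.log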
@balaji.ramaswamy
Yes, I have restarted it several times; I even restarted my servers.
@ronniearangali
Let’s try this,
Can you send me the “hadoop fs -ls” output for the full path to the file on MapR-FS, and also for the parent folders? Which user are you logged into Dremio as?
[mapr@uat-r7-bdmapr601 ~]$ hadoop fs -ls /
Found 19 items
drwxr-xr-x - mapr mapr 2 2018-05-30 15:00 /apps
drwx------ - mapr mapr 5 2019-11-20 14:41 /cache
drwxr-xr-x - mapr mapr 4 2018-12-17 17:49 /data
drwxr-xr-x - mapr mapr 7 2019-08-28 09:42 /datalake
drwxr-xr-x - mapr mapr 1 2019-11-20 13:37 /dremio
drwxr-xr-x - mapr mapr 3 2018-04-27 10:32 /installer
drwxr-xr-x - mapr mapr 51 2018-12-07 18:49 /ntt_scripts
drwxr-xr-x - mapr mapr 3 2018-06-05 11:03 /oozie
drwxr-xr-x - mapr mapr 0 2018-04-27 10:29 /opt
drwxr-xr-x - mapr mapr 4 2018-07-13 17:03 /sample_files
drwxr-xr-x - mapr mapr 13 2019-10-31 17:40 /software
drwxr-xr-x - mapr mapr 9 2019-09-25 08:08 /splice-hbase
drwxr-xr-x - mapr mapr 3 2018-12-17 17:49 /status
drwxrwxr-x - mapr hueuser 6 2018-08-30 11:26 /testace
drwxrwxr-x - mapr mapr 0 2018-06-29 10:14 /testdir
drwxrwxrwx - mapr mapr 3 2019-11-21 09:31 /tmp
drwxr-xr-x - mapr mapr 10 2018-12-13 16:23 /user
drwxr-xr-x - mapr mapr 1 2018-04-27 10:27 /var
-rw-rw-rw- 3 mapr mapr 0 2018-05-31 16:35 /yarn-mapr-scheduling-debug.log
[mapr@uat-r7-bdmapr601 ~]$
@ronniearangali
Can you work down the tree and send us the outputs?
like hadoop fs -ls /
hadoop fs -ls/
hadoop fs -ls//
hadoop fs -ls///
Which folder do you need me to drill down into?
Sorry, I gave examples and somehow the application trimmed it all off
hadoop fs -ls /folder1
hadoop fs -ls /folder1/folder2
hadoop fs -ls /folder1/folder2/actual_file
[mapr@uat-r7-bdmapr601 ~]$ hadoop fs -ls /datalake
drwxrwsr-x - maprsys bigdata 2 2018-08-21 13:30 /datalake/COLD
drwxr-sr-x - maprsys bigdata 9 2019-06-13 11:15 /datalake/RAW
drwxrwsr-x - maprsys bigdata 11 2019-11-15 16:49 /datalake/WORK
drwxrwxr-x - 5001 bigdata 3 2019-09-09 11:14 /datalake/backup
drwxr-xr-x - mapr mapr 34 2019-09-05 14:11 /datalake/hive
[mapr@uat-r7-bdmapr601 ~]$ hadoop fs -ls /datalake/hive
drwxr-xr-x - mapr mapr 221 2018-06-29 01:39 /datalake/hive/DWH.db
[mapr@uat-r7-bdmapr601 ~]$
@ronniearangali
Permissions look OK. I did not get the permissions for the table under “DWH.db” or for the PARQUET files under the table; send those too if you can. Also, via the Dremio UI, can you click on “DWH.db” and then click the copy icon next to the table name? Then do a “New Query” and run select * from “paste-what-is-in-clipboard”. This will generate a failed job. Send us the profile.
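For illustration, the generated statement would look something like this (the table name here is only a placeholder for whatever you paste from the clipboard):
-- placeholder path; paste the exact value copied from the Dremio UI
SELECT * FROM hive.dwh."your_table_name"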
Share a query profile
Thanks
@balaji.ramaswamy
[root@uat-r7-bdmapr601 3rdparty]# tail -100f /var/log/dremio/server.log
at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.Server.handle(Server.java:500) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:386) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:560) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:378) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:914) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_161]
2019-11-21 23:44:21,638 [qtp1370009304-119] INFO c.d.e.catalog.ManagedStoragePlugin - User Error Occurred [ErrorId: c4865b2d-0cf3-46cc-aff2-79610d5a348e]
**com.dremio.common.exceptions.UserException: Access denied reading dataset hive.dwh.active_ret_final.**
at com.dremio.common.exceptions.UserException$Builder.build(UserException.java:776) ~[dremio-common-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.ManagedStoragePlugin.checkAccess(ManagedStoragePlugin.java:395) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.DatasetManager.getTableFromPlugin(DatasetManager.java:373) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.DatasetManager.getTable(DatasetManager.java:208) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.CatalogImpl.getTable(CatalogImpl.java:130) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.SourceAccessChecker.lambda$getTable$3(SourceAccessChecker.java:103) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.SourceAccessChecker.checkAndGetTable(SourceAccessChecker.java:82) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.SourceAccessChecker.getTable(SourceAccessChecker.java:103) [dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.DelegatingCatalog.getTable(DelegatingCatalog.java:66) ~[dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.exec.catalog.CachingCatalog.getTable(CachingCatalog.java:93) ~[dremio-sabot-kernel-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_161]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_161]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_161]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_161]
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.invoke(ReflectionHelper.java:1287) ~[hk2-utils-2.5.0-b32.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorImpl.internalInvoke(MethodInterceptorImpl.java:109) ~[hk2-locator-2.5.0-b32.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorImpl.invoke(MethodInterceptorImpl.java:125) ~[hk2-locator-2.5.0-b32.jar:na]
at org.jvnet.hk2.internal.MethodInterceptorInvocationHandler.invoke(MethodInterceptorInvocationHandler.java:62) ~[hk2-locator-2.5.0-b32.jar:na]
at com.sun.proxy.$Proxy124.getTable(Unknown Source) ~[na:na]
at com.dremio.dac.explore.DatasetsResource.getDatasetSummary(DatasetsResource.java:258) ~[dremio-dac-backend-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.dac.explore.DatasetsResource.newUntitled(DatasetsResource.java:134) ~[dremio-dac-backend-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at com.dremio.dac.explore.DatasetsResource.newUntitledFromParent(DatasetsResource.java:196) ~[dremio-dac-backend-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_161]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_161]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_161]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_161]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) ~[jersey-common-2.25.1.jar:na]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154) ~[jersey-server-2.25.1.jar:na]
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:473) ~[jersey-container-servlet-core-2.25.1.jar:na]
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427) ~[jersey-container-servlet-core-2.25.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388) ~[jersey-container-servlet-core-2.25.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341) ~[jersey-container-servlet-core-2.25.1.jar:na]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228) ~[jersey-container-servlet-core-2.25.1.jar:na]
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:760) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1617) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at com.dremio.dac.server.SecurityHeadersFilter.doFilter(SecurityHeadersFilter.java:52) ~[dremio-dac-backend-4.0.4-201910212106180190-773b665-mapr.jar:4.0.4-201910212106180190-773b665-mapr]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:50) ~[jetty-servlets-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1296) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485) ~[jetty-servlet-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1211) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:54) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.Server.handle(Server.java:500) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:386) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:560) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:378) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268) ~[jetty-server-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) ~[jetty-io-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:367) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:914) ~[jetty-util-9.4.21.v20190926.jar:9.4.21.v20190926]
at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_161]
@ronniearangali
Click on the Jobs menu, click on the failed job, and on the right side you will see “download profile”. Click that and a .zip file will be downloaded. Please send that over.
1d5bfda5-d5d3-41e6-aeee-c731e51773aa.zip (10.4 KB)
This attached profile is from a query that works, when I query the MapR-FS folder where the Hive files are stored.
9d899ab1-df8b-49cc-a809-37daf1240e5e.zip (3.7 KB)
This one is from accessing the table through Hive, which failed.
@ronniearangali
Have you by any chance disabled impersonation on the MapR-FS source, or enabled it on the Hive source? Unfortunately we did not trap the exception in the profile, and the server.log paste you sent does not have the full stack. Would you be able to run the Hive query one more time and send us the full server.log?
Sorry for the back and forth
Thanks
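Also, one more thing that may be worth checking, given the inner cause “Error getting metastore password: null” in your first stack trace: in Hive that message generally means a password (typically javax.jdo.option.ConnectionPassword) could not be resolved from the credential store referenced in hive-site.xml. A rough check on the metastore host could look like this (the conf and keystore paths are examples only, not your real ones):
# example paths – use the values from your own hive-site.xml
grep -A1 hadoop.security.credential.provider.path /opt/mapr/hive/hive-2.1/conf/hive-site.xml
hadoop credential list -provider jceks://file/path/to/your/keystore.jceks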
Hi Balaji,
Thanks for your help. Right now I can access the Hive files using MapR-FS.
What is the advantage of accessing the data through Hive compared to accessing the Hive files directly from MapR-FS?
If there is no difference, I would prefer to access the MapR-FS location where the Hive files are stored directly.
Thanks!