Thanks for the tip. I set DREMIO_ADMIN_LOG_VERBOSITY to DEBUG and captured logs. The log file was over 161k lines, but here is a capture of the end, where I began to see Java errors:
2025-08-12 00:14:54,767 [shutdown-hook-0] DEBUG c.d.plugins.util.CloseableResource - Closing resource AmazonS3Client
2025-08-12 00:14:54,767 [shutdown-hook-0] DEBUG o.a.h.i.c.PoolingHttpClientConnectionManager - Connection manager is shutting down
2025-08-12 00:14:54,767 [shutdown-hook-0] DEBUG o.a.h.i.c.PoolingHttpClientConnectionManager - Connection manager shut down
2025-08-12 00:14:54,767 [shutdown-hook-0] DEBUG o.a.h.f.s.AWSCredentialProviderList - Closing AWSCredentialProviderList[refcount= 0: [SimpleAWSCredentialsProvider] last provider: SimpleAWSCredentialsProvider
2025-08-12 00:14:54,767 [java-sdk-http-connection-reaper] DEBUG c.a.http.IdleConnectionReaper - Reaper thread:
java.lang.InterruptedException: sleep interrupted
at java.base/java.lang.Thread.sleep(Native Method)
at com.amazonaws.http.IdleConnectionReaper.run(IdleConnectionReaper.java:188)
2025-08-12 00:14:54,767 [java-sdk-http-connection-reaper] DEBUG c.a.http.IdleConnectionReaper - Shutting down reaper thread.
2025-08-12 00:14:54,767 [Thread-2] DEBUG o.a.hadoop.util.ShutdownHookManager - Completed shutdown in 0.007 seconds; Timeouts: 0
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG c.a.h.c.ClientConnectionManagerFactory -
java.lang.reflect.InvocationTargetException: null
at jdk.internal.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.amazonaws.http.conn.ClientConnectionManagerFactory$Handler.invoke(ClientConnectionManagerFactory.java:76)
at com.amazonaws.http.conn.$Proxy35.connect(Unknown Source)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at com.amazonaws.http.apache.client.impl.SdkHttpClient.execute(SdkHttpClient.java:72)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1346)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1157)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:814)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5558)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5505)
at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1403)
at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$11(S3AFileSystem.java:2676)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:431)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2664)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2644)
at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3735)
at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3663)
at org.apache.hadoop.fs.s3a.S3AFileSystem.deleteWithoutCloseCheck(S3AFileSystem.java:3265)
at org.apache.hadoop.fs.s3a.S3AFileSystem.delete(S3AFileSystem.java:3239)
at com.dremio.plugins.util.ContainerFileSystem.delete(ContainerFileSystem.java:399)
at com.dremio.exec.hadoop.HadoopFileSystem.delete(HadoopFileSystem.java:408)
at com.dremio.io.file.FileSystemUtils$1.run(FileSystemUtils.java:60)
Caused by: java.io.InterruptedIOException: Connection already shutdown
at org.apache.http.impl.conn.DefaultManagedHttpClientConnection.bind(DefaultManagedHttpClientConnection.java:118)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:135)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
... 36 common frames omitted
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG o.a.h.i.c.DefaultManagedHttpClientConnection - http-outgoing-40: Shutdown connection
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG o.a.h.impl.execchain.MainClientExec - Connection discarded
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG o.a.h.i.c.PoolingHttpClientConnectionManager - Connection released: [id: 40][route: {s}->https://<bucket>.s3.us-east-2.amazonaws.com:443][total available: 0; route allocated: 0 of 96; total allocated: 0 of 96]
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG com.amazonaws.http.AmazonHttpClient - Unable to execute HTTP request: Connection already shutdown Request will be retried.
2025-08-12 00:14:54,768 [files-delete-on-exit] DEBUG com.amazonaws.request - Retrying Request: HEAD https://<bucket>.s3.us-east-2.amazonaws.com /dremio/uploads/_staging.dremio-admin
2025-08-12 00:14:54,768 [files-delete-on-exit] WARN o.a.h.f.s.AWSCredentialProviderList - Credentials requested after provider list was closed
2025-08-12 00:14:54,769 [files-delete-on-exit] DEBUG com.amazonaws.latency - ServiceName=[Amazon S3], ServiceEndpoint=[https://<bucket>.s3.us-east-2.amazonaws.com], Exception=[java.io.InterruptedIOException: Connection already shutdown, org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: Credentials requested after provider list was closed], RequestType=[GetObjectMetadataRequest], AWSRequestID=[null], HttpClientPoolPendingCount=0, RetryCapacityConsumed=0, HttpClientPoolAvailableCount=0, RequestCount=2, Exception=2, HttpClientPoolLeasedCount=0, ClientExecuteTime=[7.669], HttpRequestTime=[6.842], ApiCallLatency=[7.345], RequestSigningTime=[0.161], CredentialsRequestTime=[0.118, 0.001, 0.073],
2025-08-12 00:14:54,769 [files-delete-on-exit] WARN com.dremio.io.file.FileSystemUtils - Could not delete path <bucket>/dremio/uploads/_staging.dremio-admin
java.nio.file.AccessDeniedException: <bucket>/dremio/uploads/_staging.dremio-admin: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: Credentials requested after provider list was closed
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:215)
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:174)
at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3760)
at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3663)
at org.apache.hadoop.fs.s3a.S3AFileSystem.deleteWithoutCloseCheck(S3AFileSystem.java:3265)
at org.apache.hadoop.fs.s3a.S3AFileSystem.delete(S3AFileSystem.java:3239)
at com.dremio.plugins.util.ContainerFileSystem.delete(ContainerFileSystem.java:399)
at com.dremio.exec.hadoop.HadoopFileSystem.delete(HadoopFileSystem.java:408)
at com.dremio.io.file.FileSystemUtils$1.run(FileSystemUtils.java:60)
Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: Credentials requested after provider list was closed
at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:166)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1290)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1157)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:814)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5558)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5505)
at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1403)
at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$11(S3AFileSystem.java:2676)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:431)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2664)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2644)
at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3735)
... 6 common frames omitted
2025-08-12 00:14:54,777 [Thread-2] DEBUG o.a.hadoop.util.ShutdownHookManager - ShutdownHookManager completed shutdown.
A few hundred lines further up, I also see this:
Restore failed for the 'embedded_pointers' table backup
Restore failed for the 'configuration' table backup
Restore failed for the 'dac-namespace' table backup
Restore failed for the 'catalog-source-data' table backup
2025-08-12 00:14:54,760 [files-delete-on-exit] DEBUG o.apache.hadoop.fs.s3a.S3AFileSystem - Getting path status for s3a://<bucket>/dremio/uploads/_staging.dremio-admin (dremio/uploads/_staging.dremio-admin); needEmptyDirectory=true
2025-08-12 00:14:54,760 [files-delete-on-exit] DEBUG o.apache.hadoop.fs.s3a.S3AFileSystem - S3GetFileStatus s3a://<bucket>/dremio/uploads/_staging.dremio-admin
2025-08-12 00:14:54,761 [files-delete-on-exit] DEBUG o.apache.hadoop.fs.s3a.S3AFileSystem - HEAD dremio/uploads/_staging.dremio-admin with change tracker null
2025-08-12 00:14:54,761 [files-delete-on-exit] DEBUG o.a.h.f.s.audit.impl.LoggingAuditor - [91] fe45a022-ae11-45bf-917f-ad92c9a61231-00000098 Executing op_delete with {action_http_head_request 'dremio/uploads/_staging.dremio-admin' size=0, mutating=false}; https://audit.example.org/hadoop/1/op_delete/fe45a022-ae11-45bf-917f-ad92c9a61231-00000098/?op=op_delete&p1=s3a://<bucket>/dremio/uploads/_staging.dremio-admin&pr=dremio&ps=9d90e23c-0f1d-4997-8f95-5068bfc70344&id=fe45a022-ae11-45bf-917f-ad92c9a61231-00000098&t0=91&fs=fe45a022-ae11-45bf-917f-ad92c9a61231&t1=91&ts=1754957694760
2025-08-12 00:14:54,761 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem - FileSystem.close() by method: org.apache.hadoop.fs.FilterFileSystem.close(FilterFileSystem.java:529)); Key: (dremio (auth:SIMPLE))@file://; URI: file:///; Object Identity Hash: 4fda7e81
2025-08-12 00:14:54,761 [files-delete-on-exit] DEBUG com.amazonaws.request - Sending Request: HEAD https://<bucket>.s3.us-east-2.amazonaws.com /dremio/uploads/_staging.dremio-admin
2025-08-12 00:14:54,761 [shutdown-hook-0] DEBUG org.apache.hadoop.fs.FileSystem - FileSystem.close() by method: org.apache.hadoop.fs.RawLocalFileSystem.close(RawLocalFileSystem.java:895)); Key: null; URI: file:///; Object Identity Hash: 411d990a
2025-08-12 00:14:54,761 [files-delete-on-exit] DEBUG com.amazonaws.auth.AWS4Signer - AWS4 Canonical Request: '"HEAD
/dremio/uploads/_staging.dremio-admin
Note that I’ve replaced my bucket name with <bucket> in the logs. I noticed the S3 path dremio/uploads/_staging.dremio-admin mentioned quite a few times. I'm not sure if that’s important, but I did check S3 and don’t see a folder at that path.
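In case it helps, this is roughly how the check for that staging prefix can be done from the AWS CLI (the bucket name is the same placeholder I used above; the console can hide "directory" keys that only exist as part of object keys, so listing recursively is safer):

```shell
# List everything under the uploads prefix and filter for the staging folder.
# <bucket> is a placeholder for the real bucket name.
aws s3 ls "s3://<bucket>/dremio/uploads/" --recursive | grep '_staging'
```

If this prints nothing, the `_staging.dremio-admin` key really doesn't exist, which would be consistent with the delete-on-exit hook failing: the HEAD request in the log is the S3A client probing for that path during shutdown, after the credential provider list has already been closed.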