We are currently running Dremio 24.2.2 on AKS (Kubernetes 1.27.3). In our deployment, we followed the instructions in the Dremio documentation (Configuring Distributed Storage | Dremio Documentation) to set up distributed storage on ADLS. We tried both an access key and OAuth2 for storage authentication, but queries fail with both methods. The detailed error message is provided below.
IO_EXCEPTION ERROR: Unable to load key provider class.
SqlOperatorImpl ICEBERG_SUB_SCAN
Location 0:0:10
SqlOperatorImpl ICEBERG_SUB_SCAN
Location 0:0:10
Fragment 0:0
[Error Id: afd0c0cb-5077-4fa9-b5f4-85456e10917f on dremio-executor-dremio-qasprint-1-0.dremio-cluster-pod-dremio-qasprint-1.qa.svc.cluster.local:0]
(org.apache.hadoop.fs.azurebfs.contracts.exceptions.TokenAccessProviderException) Unable to load key provider class.
org.apache.hadoop.fs.azurebfs.AbfsConfiguration.getTokenProvider():526
org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.initializeClient():878
org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.<init>():151
org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize():106
org.apache.hadoop.fs.FileSystem.createFileSystem():3469
org.apache.hadoop.fs.FileSystem.get():537
com.dremio.exec.store.dfs.DremioFileSystemCache.get():69
com.dremio.plugins.azure.AzureStorageFileSystem$ContainerCreatorImpl$FileSystemSupplierImpl.create():266
com.dremio.plugins.util.ContainerFileSystem$FileSystemSupplier.get():245
com.dremio.plugins.util.ContainerFileSystem$ContainerHolder.fs():203
com.dremio.plugins.util.ContainerFileSystem.getFileStatus():493
com.dremio.exec.hadoop.HadoopFileSystem.getFileAttributes():258
com.dremio.exec.store.hive.exec.DremioFileSystem.getFileStatus():365
com.dremio.exec.store.hive.exec.dfs.DremioHadoopFileSystemWrapper.getFileAttributes():239
com.dremio.io.file.FilterFileSystem.getFileAttributes():77
com.dremio.exec.store.dfs.LoggedFileSystem.getFileAttributes():113
com.dremio.exec.store.iceberg.DremioFileIO.newInputFile():78
org.apache.iceberg.TableMetadataParser.read():266
com.dremio.exec.store.iceberg.IcebergUtils.loadTableMetadata():1331
com.dremio.exec.store.iceberg.IcebergManifestListRecordReader.setup():136
com.dremio.sabot.op.scan.ScanOperator.setupReaderAsCorrectUser():348
com.dremio.sabot.op.scan.ScanOperator.setupReader():339
com.dremio.sabot.op.scan.ScanOperator.setup():303
com.dremio.sabot.driver.SmartOp$SmartProducer.setup():595
com.dremio.sabot.driver.Pipe$SetupVisitor.visitProducer():80
com.dremio.sabot.driver.Pipe$SetupVisitor.visitProducer():64
com.dremio.sabot.driver.SmartOp$SmartProducer.accept():565
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.Pipeline.setup():71
com.dremio.sabot.exec.fragment.FragmentExecutor.setupExecution():621
com.dremio.sabot.exec.fragment.FragmentExecutor.run():443
com.dremio.sabot.exec.fragment.FragmentExecutor.access$1700():108
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():1007
com.dremio.sabot.task.AsyncTaskWrapper.run():122
com.dremio.sabot.task.slicing.SlicingThread.mainExecutionLoop():249
com.dremio.sabot.task.slicing.SlicingThread.run():171
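For context, here is a rough Java sketch of the ABFS OAuth2 initialization path that shows up in the stack trace above. The property names follow the Hadoop ABFS documentation; the storage account, container, client id/secret, and tenant endpoint are placeholders, not our actual deployment values.

// Hedged sketch: roughly how the ABFS client is initialized for OAuth2.
// Placeholder values only; property names are from the Hadoop ABFS docs.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class AbfsOAuthCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String account = "mystorageaccount.dfs.core.windows.net"; // placeholder

        // With auth type OAuth, a token provider class must be resolvable;
        // AbfsConfiguration.getTokenProvider() throws
        // "Unable to load key provider class." when it is missing or cannot be loaded.
        conf.set("fs.azure.account.auth.type." + account, "OAuth");
        conf.set("fs.azure.account.oauth.provider.type." + account,
                "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider");
        conf.set("fs.azure.account.oauth2.client.id." + account, "<client-id>");
        conf.set("fs.azure.account.oauth2.client.secret." + account, "<client-secret>");
        conf.set("fs.azure.account.oauth2.client.endpoint." + account,
                "https://login.microsoftonline.com/<tenant-id>/oauth2/token");

        // Touching the filesystem triggers the same initialization path as in the stack trace.
        FileSystem fs = FileSystem.get(
                new URI("abfss://mycontainer@" + account + "/"), conf);
        System.out.println(fs.exists(new Path("/")));
    }
}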
Interestingly, when we tested the same configuration on Dremio 24.1, we did not encounter any problems.
Any insights, thoughts, or suggestions regarding this issue would be highly appreciated.
Thanks,
Ratheesh