Hi, I'm getting the following error with a new setup when I try to run a query.
Setup
AKS, 3 nodes, Azure Storage Gen 2
Cloud Cache enabled
all processes running under the dremio user (verified via ps -ef | grep dremio)
Storage Container
ERROR
SYSTEM ERROR: NativeIOException: Operation not permitted
SqlOperatorImpl ARROW_WRITER
Location 0:0:3
SqlOperatorImpl ARROW_WRITER
Location 0:0:3
Fragment 0:0
[Error Id: a2b66576-8fca-407f-9df4-89aebbd816b5 on dremio-executor-0.dremio-cluster-pod.bd-dremio.svc.cluster.local:0]
(org.apache.hadoop.io.nativeio.NativeIOException) Operation not permitted
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl():-2
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod():234
org.apache.hadoop.fs.RawLocalFileSystem.setPermission():861
org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode():547
org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission():587
org.apache.hadoop.fs.RawLocalFileSystem.mkdirs():559
org.apache.hadoop.fs.RawLocalFileSystem.create():316
org.apache.hadoop.fs.RawLocalFileSystem.create():351
com.dremio.exec.store.dfs.PseudoDistributedFileSystem.create():369
org.apache.hadoop.fs.FileSystem.create():1118
org.apache.hadoop.fs.FileSystem.create():1098
org.apache.hadoop.fs.FileSystem.create():987
org.apache.hadoop.fs.FileSystem.create():975
com.dremio.exec.hadoop.HadoopFileSystem.create():212
com.dremio.exec.store.easy.arrow.ArrowRecordWriter.setup():98
com.dremio.sabot.op.writer.WriterOperator.setup():107
com.dremio.sabot.driver.SmartOp$SmartSingleInput.setup():255
com.dremio.sabot.driver.Pipe$SetupVisitor.visitSingleInput():73
com.dremio.sabot.driver.Pipe$SetupVisitor.visitSingleInput():63
com.dremio.sabot.driver.SmartOp$SmartSingleInput.accept():200
com.dremio.sabot.driver.StraightPipe.setup():103
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.Pipeline.setup():68
com.dremio.sabot.exec.fragment.FragmentExecutor.setupExecution():391
com.dremio.sabot.exec.fragment.FragmentExecutor.run():273
com.dremio.sabot.exec.fragment.FragmentExecutor.access$1400():94
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():711
com.dremio.sabot.task.AsyncTaskWrapper.run():112
com.dremio.sabot.task.slicing.SlicingThread.mainExecutionLoop():228
com.dremio.sabot.task.slicing.SlicingThread.run():159
@rwetzeler
Is this AWSE? Query results are written to local disk. Does the Dremio user have permission to write results there?
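A quick way to test that, assuming the default data path under /opt/dremio and the pod/namespace names from the error above (adjust to your deployment):

# Attempt a test write in the pod's local data directory as the container user.
# The /opt/dremio/data path is an assumption, not taken from your setup.
kubectl exec -n bd-dremio dremio-executor-0 -- sh -c \
  'touch /opt/dremio/data/.perm_test && rm /opt/dremio/data/.perm_test && echo write OK'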
Thanks
Bali
This is running in Azure Kubernetes Service (AKS). Permissions appear to be fine for uploads, since uploading a file works. But when I query that file, I get the error above.
@rwetzeler
That is right. Uploads go to the distributed store, while query results are written to the local results path, which is why the upload succeeds but the query fails. Dremio is unable to write results to that folder; see the stack below:
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl():-2
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod():234
org.apache.hadoop.fs.RawLocalFileSystem.setPermission():861
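The chmod frames show the native chmod call itself being denied (EPERM), which usually means the directory is owned by a different user, for example root if the persistent volume was mounted that way, or that the mounted filesystem does not allow POSIX permission changes. A quick way to compare the process UID with the directory owner, assuming the default data path (pod and namespace are taken from the error above):

# Show the UID/GID the container runs as, then the owner of the data path.
# The results subdirectory may not exist yet if mkdirs itself failed.
kubectl exec -n bd-dremio dremio-executor-0 -- id
kubectl exec -n bd-dremio dremio-executor-0 -- ls -ld /opt/dremio/data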
srihari
September 14, 2022, 11:56am
@balaji.ramaswamy @rwetzeler
We are facing the same NativeIOException: Operation not permitted issue.
FYI, we have installed Dremio on an AKS cluster via Helm charts, and in values.yaml we have set distStorage to azureStorage.
Could you please let us know where Dremio is unable to write the results: is it on the AKS PV or on the dist storage (Azure Storage)?
Here is the error log:
SYSTEM ERROR: NativeIOException: Operation not permitted
SqlOperatorImpl ARROW_WRITER
Location 1:3:1
SqlOperatorImpl ARROW_WRITER
Location 1:3:1
Fragment 1:0
[Error Id: 0cb175ce-fa26-4d9b-bac4-fa9b2c05d6d8 on dremio-executor-1.dremio-cluster-pod.dremio.svc.cluster.local:0]
(org.apache.hadoop.io.nativeio.NativeIOException) Operation not permitted
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl():-2
org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod():382
org.apache.hadoop.fs.RawLocalFileSystem.setPermission():974
org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode():660
org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission():700
org.apache.hadoop.fs.RawLocalFileSystem.mkdirs():672
org.apache.hadoop.fs.RawLocalFileSystem.create():424
org.apache.hadoop.fs.RawLocalFileSystem.create():459
com.dremio.exec.store.dfs.PseudoDistributedFileSystem.create():371
org.apache.hadoop.fs.FileSystem.create():1195
org.apache.hadoop.fs.FileSystem.create():1175
org.apache.hadoop.fs.FileSystem.create():1064
org.apache.hadoop.fs.FileSystem.create():1052
com.dremio.exec.hadoop.HadoopFileSystem.create():226
com.dremio.exec.store.easy.arrow.ArrowRecordWriter.setup():111
com.dremio.sabot.op.writer.WriterOperator.setup():118
com.dremio.sabot.driver.SmartOp$SmartSingleInput.setup():261
com.dremio.sabot.driver.Pipe$SetupVisitor.visitSingleInput():73
com.dremio.sabot.driver.Pipe$SetupVisitor.visitSingleInput():63
com.dremio.sabot.driver.SmartOp$SmartSingleInput.accept():206
com.dremio.sabot.driver.StraightPipe.setup():103
com.dremio.sabot.driver.StraightPipe.setup():102
com.dremio.sabot.driver.Pipeline.setup():69
com.dremio.sabot.exec.fragment.FragmentExecutor.setupExecution():478
com.dremio.sabot.exec.fragment.FragmentExecutor.run():327
com.dremio.sabot.exec.fragment.FragmentExecutor.access$1600():97
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():820
com.dremio.sabot.task.AsyncTaskWrapper.run():120
com.dremio.sabot.task.slicing.SlicingThread.mainExecutionLoop():247
com.dremio.sabot.task.slicing.SlicingThread.run():171
@srihari In your values.yaml, you should find a section where results are directed to a local disk. Where is that?
distStorage:
  type: "azureStorage"
We have chosen azureStorage and provided its access keys.
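Note that distStorage controls only the dist path (reflections, uploads, and so on). In the Helm chart's dremio.conf template, job results are typically pinned to a pdfs:// path on each pod's local disk, which matches the RawLocalFileSystem frames in the trace: the failing write is on the pod's local volume, not on Azure Storage. An illustrative paths block with placeholder values; the exact defaults depend on your chart and Dremio version, so check the generated dremio.conf inside a pod:

paths: {
  # local data directory inside each pod, usually a persistent volume mount
  local: ${DREMIO_HOME}"/data"
  # job results: pdfs:// writes to the local disk of each node
  results: "pdfs://"${paths.local}"/results"
  # distributed storage, rendered from distStorage in values.yaml
  dist: "dremioAzureStorage://:///<FILESYSTEM_NAME>/<FOLDER>"
}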
@balaji.ramaswamy Could you please let us know how to troubleshoot the issue below? The error log is the same as the one I posted above.
srihari
October 27, 2022, 10:17am
@balaji.ramaswamy Could you please help me with this?
@srihari Do other dist objects, like reflections, get created successfully? If not, then we need to see if the Dremio service user is able to read/write to those folders. You can validate that by reading/writing from the master/executor pods.
If reflections are getting created, then you can move results to local and retry. Would you be able to upload your values.yaml (taking out sensitive data) and also the job profile zip file?
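A minimal sketch of both steps, assuming a Helm release named dremio in the dremio namespace and the default data path (all of these names are assumptions; adjust to your deployment):

# Export the user-supplied values for the release; redact access keys
# before uploading.
helm get values dremio -n dremio > dremio-values.yaml

# Validate read/write as the service user on the master and executor pods.
for p in dremio-master-0 dremio-executor-0 dremio-executor-1; do
  echo "== $p =="
  kubectl exec -n dremio "$p" -- sh -c \
    'id && touch /opt/dremio/data/.rw_test && rm /opt/dremio/data/.rw_test && echo write OK'
done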