This is a reflection on a virtual dataset whose parent physical dataset does not yet have a reflection that has run.
I think I've seen a post in the community saying you have to run a query against the parent first to expose the schema, but that really isn't feasible with large data sets (10 million records).
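For reference, my understanding of that suggested workaround is roughly the following; the source and dataset names here are just placeholders, not my actual objects:

```sql
-- Roughly the workaround I've seen suggested (placeholder names):
-- run a query against the parent physical dataset first so its schema is
-- learned, then build/refresh the reflection on the virtual dataset on top of it.
SELECT *
FROM "my_source"."my_parent_pds"
LIMIT 100
```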
Can anyone explain this error?
DATA_WRITE ERROR: Failure while attempting to spill sort data to disk.
SqlOperatorImpl EXTERNAL_SORT
Location 0:0:8
Fragment 0:0
[Error Id: 34511422-f1e2-4d67-a4b6-37a405f21cd5 on ip-172-31-34-69.ec2.internal:31010]
(java.lang.NullPointerException) Container schema is not set. Either schema was not built, or schema was cleared.
com.google.common.base.Preconditions.checkNotNull():226
com.dremio.exec.record.VectorContainer.getSchema():315
com.dremio.sabot.op.sort.external.DiskRunManager.spill():420
com.dremio.sabot.op.sort.external.MemoryRun.closeToDisk():362
com.dremio.sabot.op.sort.external.ExternalSortOperator.rotateRuns():283
com.dremio.sabot.op.sort.external.ExternalSortOperator.consumeData():172
com.dremio.sabot.driver.SmartOp$SmartSingleInput.consumeData():235
com.dremio.sabot.driver.StraightPipe.pump():59
com.dremio.sabot.driver.Pipeline.doPump():82
com.dremio.sabot.driver.Pipeline.pumpOnce():72
com.dremio.sabot.exec.fragment.FragmentExecutor$DoAsPumper.run():288
com.dremio.sabot.exec.fragment.FragmentExecutor$DoAsPumper.run():284
java.security.AccessController.doPrivileged():-2
javax.security.auth.Subject.doAs():422
org.apache.hadoop.security.UserGroupInformation.doAs():1807
com.dremio.sabot.exec.fragment.FragmentExecutor.run():243
com.dremio.sabot.exec.fragment.FragmentExecutor.access$800():83
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():577
com.dremio.sabot.task.AsyncTaskWrapper.run():92
com.dremio.sabot.task.slicing.SlicingThread.run():71