Attempting to write a too large value for field with index

I’m getting the following error message when running a query on a CSV file in Blob storage:

Attempting to write a too large value for field with index 4. Size was 37468 but limit was 32000.

Any ideas?
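
Is there perhaps a support setting that raises this per-field limit? I was imagining something along these lines, though I have not verified that a key with this name exists:

    -- Unverified sketch: raising Dremio's per-field size limit via a support key.
    -- The key name "limits.single_field_size_bytes" is an assumption; check the
    -- Support settings of your Dremio build before relying on it.
    ALTER SYSTEM SET "limits.single_field_size_bytes" = 65536;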

Hi there!

I have the same error, but with different limit values. I’m testing the Windows version of Dremio on my notebook. Does anybody have a solution? Thanks!

“Attempting to write a too large value for field with index 45. Size was 65536 but limit was 65536.”

I have the same issue, and it’s reading from the raw text file with no casting or anything. It breaks at a really odd place in the file, and I verified the file is valid UTF-8.

The really strange thing is that if I take a sample of the file around the byte region where it complains and import it, there is no problem.

I finally solved this issue, and here is what the cause was:

  1. The error from Dremio reported a particular byte offset (OFFSET below).
  2. I tried to find the line containing that offset by counting the newlines before it:
    head -c OFFSET FILE | wc -l
  3. This returned a newline count, LINE_NO (strictly, the offset falls on line LINE_NO + 1, since wc -l only counts completed lines). I then printed the header plus that line with:
    sed -n '1p;LINE_NOp' FILE
  4. The LINE_NO this gave me turned out to be off by 460 lines in my case.
  5. The actual line causing the issue contained a character that was defined as my quote character (a " in this case). That was the real cause, but it was very hard to find because the byte offset appeared to be wrong; I had to filter and shift the file and re-import it into Dremio several times.
  6. Disabling the quote character by setting it to a control character solved the issue; the sketch after this list shows an inline way to express the same override.
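
If your Dremio build supports table options for the text format, the same override can be sketched inline in a query. This is unverified: the file path is hypothetical, and I am assuming the documented text-format option names (type, fieldDelimiter, quote) apply here:

    -- Unverified sketch: overriding the CSV quote character at query time so a
    -- stray " inside a field is treated as data rather than as field quoting.
    -- The path dfs."mydata.csv" is a placeholder; pick a quote character that
    -- can never appear in the data.
    SELECT *
    FROM table(dfs."mydata.csv" (type => 'text', fieldDelimiter => ',', quote => '~'))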

Can someone verify whether my method of finding the errant line is flawed, or whether the byte offset returned by Dremio in the error is incorrect?

I have the same issue with a SQL Server source.

  UNSUPPORTED_OPERATION ERROR: Attempting to read a too large value for field with index 2. Size was 32219 but limit was 32000.

fieldIndex 2
size 32219
limit 32000
SqlOperatorImpl JDBC_SUB_SCAN
Location 1:0:6
Fragment 1:0

[Error Id: 5d63a69e-7db9-4602-b95b-b2ab71377163 on localhost:31010]

Any suggestions for fixing this limitation?
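
One workaround I can sketch (untested) is to truncate the oversized column before Dremio materializes it, so no value exceeds the 32000-byte field limit. The source, schema, table, and column names below are placeholders, and this only helps if Dremio pushes the truncation down to SQL Server, so check the query profile to confirm the pushdown:

    -- Untested sketch: keep the wide column under the 32000-byte field limit.
    -- mssql, dbo."my_table", id, and big_text_col are all placeholder names.
    SELECT id,
           LEFT(big_text_col, 32000) AS big_text_col
    FROM mssql.dbo."my_table"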