Attempting to read a too large value for field with name description

I added an S3 bucket as a data source; it had a folder full of gzip files containing JSON data.

As I was trying to create a VDS with one of these folders (size: 2.1 GiB, 257 objects), the preview had no errors, but when I pressed save, this error occurred:

Attempting to read a too large value for field with name description. Size was 43341 but limit was 32000.

Description is the name of a column in the dataset. It contains some HTML snippets from a scraping job.
So, is one of the strings in the rows too large for the datatype being assigned? That is my guess, but I don't know.
How would I fix this?
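
To check my guess, I could scan the raw files before Dremio reads them, with something like this (a rough sketch that assumes a local copy of the folder and newline-delimited JSON inside each .gz file; the path and field name are placeholders for my layout):

```python
import glob
import gzip
import json

LIMIT = 32_000  # the 32,000-byte limit from the error message

# Walk a local copy of the S3 folder and report any record whose
# "description" value is larger than the limit (UTF-8 byte count).
for path in glob.glob("data/*.json.gz"):
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            value = json.loads(line).get("description")
            if isinstance(value, str):
                size = len(value.encode("utf-8"))
                if size > LIMIT:
                    print(f"{path}:{line_no} description is {size} bytes")
```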

I am also getting a similar error since migrating to Dremio 3.2, but the same query worked fine with Dremio 3.1. Is there any restriction on the number of columns …

I got this issue when creating a reflection on a VDS.

@Vikash_Singh and @tonio, in our recent releases, we’ve introduced “guard rails” to Dremio, where we put a limit on particular system parameters. This is one such limit.

If you stay within these limits, you have a stronger guarantee of system stability (these are the limits within which we’ve tested Dremio). It’s possible to tune these, but you do so at your own risk.

Thanks, yes, it's better to stay within the limits. I am modifying the column field so that the data does not exceed 32 KB.
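
For the gzipped newline-delimited JSON case from the original post, the trimming could look roughly like this (a sketch only; the field name, byte budget, and file paths are placeholders):

```python
import gzip
import json

MAX_BYTES = 30_000  # stay under the 32,000-byte limit with some margin

def truncate_field(record, field="description", max_bytes=MAX_BYTES):
    """Trim a string field so its UTF-8 encoding fits within max_bytes."""
    value = record.get(field)
    if isinstance(value, str) and len(value.encode("utf-8")) > max_bytes:
        # Cut at the byte budget, then drop any broken trailing character.
        record[field] = value.encode("utf-8")[:max_bytes].decode("utf-8", "ignore")
    return record

# Rewrite one gzipped newline-delimited JSON file with the field trimmed.
with gzip.open("data/part-0001.json.gz", "rt", encoding="utf-8") as src, \
     gzip.open("data/part-0001.trimmed.json.gz", "wt", encoding="utf-8") as dst:
    for line in src:
        dst.write(json.dumps(truncate_field(json.loads(line))) + "\n")
```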

Also, where can I find a list of these limits, so that I know what I am doing before asking a question?

They will be reported (like in this example) when you encounter them.

Their names are relatively descriptive, but we do not currently have a published list of all of them with notes on their meaning.

@ben how would I tune such parameters?
Are those parameters in the dremio-env config file?
Are they documented somewhere?

The parameter is limits.single_field_size_bytes and can be changed in Admin->Support->Support Keys.
But generally it’s not a good idea to change this drastically. We may remove the ability to change these limits in the near future.
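
For anyone who needs to script the change rather than use the UI, I believe the same value can also be set with an ALTER SYSTEM statement sent through the SQL REST API. A rough sketch follows; the host, credentials, and the 64 KB value are just examples, whether ALTER SYSTEM covers this particular support key may depend on your version, so please verify, and the same caution about changing it applies:

```python
import requests

DREMIO = "http://localhost:9047"  # coordinator URL; adjust for your deployment

# Log in and obtain an auth token.
login = requests.post(f"{DREMIO}/apiv2/login",
                      json={"userName": "admin", "password": "changeme"})
token = login.json()["token"]
headers = {"Authorization": f"_dremio{token}"}

# Raise the per-field limit to 64 KB (at your own risk, as noted above).
sql = 'ALTER SYSTEM SET "limits.single_field_size_bytes" = 65536'
resp = requests.post(f"{DREMIO}/api/v3/sql", headers=headers, json={"sql": sql})
print(resp.status_code, resp.text)
```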

Hi,
We are getting the same issue and were able to fix it by fine-tuning the limits.single_field_size_bytes parameter.
The error happens as soon as you try to browse the source (I'm not talking about the VDS or a reflection refresh).
Though I'm not comfortable with the workaround, as you mention that it's not a good idea to increase it.
The fact is that the source is an Elasticsearch index that contains document attachments. Obviously, most of these documents exceed the 32,000 (bytes, I suppose) limit.
Our application accepts documents of up to 30 MB.
I'm not sure what the right solution would be; how can I fix the issue without changing the limits.single_field_size_bytes parameter?

Cheers,