We attempted an upgrade to v12.0.0 (in the past we attempted an upgrade to an older version with the same result) and we cannot create reflections on at least one of our PDS (HDFS, parquet files) due to the “Field exceeds the size limit of 32000 bytes” error (it doesn’t specify which field).
From what I know, the limit existed in v4.1.8 too, with the same value. If the error at least specified the field, we could check it to see whether it’s a false positive or not. But the fact that it works fine with 4.1.8 is really strange - can you tell which field is the issue from the profile?
@arisro Currently we do not; the best option is to talk to your data enrichment team and see if they know which columns. I totally agree with you that the name of the column exceeding the field width limit needs to be displayed. We have an internal ticket that I will try to prioritize soon.
Ok, thank you - I’m not even sure any column has such a long value, but we’ll check.
But can you confirm from the profile that it’s about the value of a column and not about something else?
What is really weird is that the dataset is pretty big, several TB (10 billion rows), and on v4.1.8 we never hit that 32k limit error - while on v12.0.0 we hit it after only about 500k rows are scanned.
I remember people reporting that the limit was not enforced correctly in 4.1.8; it was probably fixed somewhere after that. Yes, it is the value of the columns.
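Thanks. For anyone else hitting this before the error message names the field: since it’s confirmed to be column values, here is a rough sketch of how we plan to scan the parquet files ourselves for oversized values (assuming pyarrow with HDFS support; the path is hypothetical, and the 32000-byte threshold comes from the error message):

```python
import pyarrow as pa
import pyarrow.compute as pc
import pyarrow.dataset as ds

LIMIT = 32000  # byte limit from the error message

# Hypothetical location -- point this at the actual PDS files on HDFS
# (or at a local copy of a few parquet files to start with).
dataset = ds.dataset("hdfs://namenode/path/to/pds", format="parquet")

# Only (large_)string columns can carry oversized text values here;
# binary columns could be added to the check the same way.
string_cols = [
    f.name for f in dataset.schema
    if pa.types.is_string(f.type) or pa.types.is_large_string(f.type)
]

# Scan batch by batch and report any column whose longest value
# exceeds the limit. binary_length gives the length in bytes.
for batch in dataset.to_batches(columns=string_cols):
    for name in string_cols:
        max_len = pc.max(pc.binary_length(batch.column(name))).as_py()
        if max_len is not None and max_len > LIMIT:
            print(f"{name}: found a value of {max_len} bytes")
```

If nothing turns up over the whole dataset, that would point to a false positive on the Dremio side rather than the data.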