I added an S3 bucket as a data source; it has a folder full of gzip files containing JSON data.
When I tried to create a VDS from one of these folders (size: 2.1 GiB, 257 objects), the preview showed no errors, but when I pressed save, this error occurred:
Attempting to read a too large value for field with name description. Size was 43341 but limit was 32000.
description is the name of a column in the dataset; it contains HTML snippets from a web scrape.
So, is one of the strings in the rows too large for the datatype being assigned? That is my guess, but I don't know.
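To check my guess, I scanned the files locally with a small script. This is just a sketch that assumes one JSON object per line in each gzip file (which is how my files are laid out); the 32000 limit is taken from the error message:

```python
import gzip
import json

LIMIT = 32000  # the limit reported in the error message

def oversized_descriptions(path, limit=LIMIT):
    """Scan a gzipped newline-delimited JSON file and return
    (line_number, byte_size) for every row whose 'description'
    field exceeds the limit."""
    hits = []
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for lineno, line in enumerate(f, 1):
            if not line.strip():
                continue
            record = json.loads(line)
            desc = record.get("description") or ""
            size = len(desc.encode("utf-8"))
            if size > limit:
                hits.append((lineno, size))
    return hits
```

Running this over the folder does find rows where description is well over 32000 bytes, so the data matches the number in the error.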
How would I fix this?