How to handle this 32k size issue?
We have two columns in our source table that exceed 32k in size, and we need both of them.
Is there a workaround for handling this in Dremio, either from the UI or from PySpark, e.g. by raising the limit to 90k or so?
32k is the currently supported limit. The Dremio team is working on enhancing the error message so that it points to the column that exceeds 32k.
Thanks
Bali
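Since the limit itself cannot be raised, one workaround is to truncate oversized string values on the source side before they reach Dremio. Below is a minimal sketch in plain Python, assuming the 32,000-byte default limit; in PySpark you could wrap the helper in a UDF and apply it to the two wide columns. The function name and limit constant are illustrative, not part of any Dremio API.

```python
# Sketch: truncate a string so its UTF-8 encoding fits Dremio's
# single-field size limit (assumed to be 32,000 bytes by default).
FIELD_LIMIT_BYTES = 32000

def truncate_utf8(value: str, max_bytes: int = FIELD_LIMIT_BYTES) -> str:
    """Return value truncated so that its UTF-8 encoding is at most
    max_bytes long, never splitting a multibyte character."""
    encoded = value.encode("utf-8")
    if len(encoded) <= max_bytes:
        return value
    # Cut at the byte boundary, then drop any trailing partial character.
    return encoded[:max_bytes].decode("utf-8", errors="ignore")
```

In PySpark this could be registered with `pyspark.sql.functions.udf` and applied to each oversized column before the table is written to the source Dremio reads from. Note that truncation loses data; it only helps if the tail of the value is dispensable.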
Thanks @balaji.ramaswamy,
One more question -
We normally run many Hive/Impala queries stored in .hql files, and these files are invoked from Spark or shell programs.
How can the same process be handled with Dremio?
Dremio does not understand HQL, only SQL. You should be able to run the query's SQL in Dremio and save it as a VDS. Some HQL-specific semantics may not work.
Thanks
Bali
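To replace the "shell script runs an .hql file" workflow, the query text can be submitted to Dremio programmatically, for example through Dremio's REST SQL endpoint (`POST /api/v3/sql`). The sketch below assumes a coordinator on the default port 9047 and a valid auth token; the host, token handling, and file path are placeholders, and HQL-only statements (`SET`, `ADD JAR`, etc.) would need to be stripped from the file first.

```python
import json
import urllib.request

# Assumption: Dremio coordinator on the default REST port.
DREMIO_URL = "http://localhost:9047"

def build_sql_payload(sql: str) -> bytes:
    """Build the JSON request body for Dremio's REST SQL endpoint."""
    return json.dumps({"sql": sql}).encode("utf-8")

def submit_hql_file(path: str, token: str) -> None:
    """Read a query file and submit its contents to Dremio as plain SQL.
    Any HQL-specific statements should be removed beforehand."""
    with open(path) as f:
        sql = f.read()
    req = urllib.request.Request(
        f"{DREMIO_URL}/api/v3/sql",
        data=build_sql_payload(sql),
        headers={
            "Authorization": f"_dremio{token}",  # Dremio token header format
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)  # requires a live Dremio server
```

A driver script could loop over the existing .hql files and call `submit_hql_file` for each, replacing the current spark/shell invocation step. Alternatives include querying Dremio over JDBC/ODBC from Spark itself.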