Error when unnesting a JSON column and extracting values

Hey, I’m having difficulty unnesting and extracting values from my dataset, which stores JSON columns as varchar.
I understand the first step is to convert the varchar to JSON with CONVERT_FROM(varchar_column, 'JSON') and then extract the values I want with "SELECT table.column.key1 FROM table_converted_with_jsons". Am I right?
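As a minimal sketch of the two-step approach described above (the table name `PaymentAudits` and column `request` are taken from this thread; `key1` is a hypothetical field name), the conversion and extraction could be combined in one query:

```sql
-- Step 1: convert the varchar column holding JSON text into a JSON type.
-- Step 2: extract a nested field using Dremio's dot notation.
-- "key1" is a placeholder; substitute a real key from the JSON payload.
WITH converted AS (
  SELECT CONVERT_FROM(request, 'JSON') AS request_json
  FROM PaymentAudits
)
SELECT c.request_json.key1
FROM converted AS c
```

This avoids materializing an intermediate table just to do the extraction, assuming the conversion itself succeeds.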

So, here’s the error in step 1 with the following query:

→ "select convert_from(request, 'json') from PaymentAudits limit 1"
→ (It takes a few seconds and then shows the message: "New field in the schema found. Please reattempt the query. Multiple attempts may be necessary to fully learn the schema.")

I reattempted many times, and I also tested with just 1 row to see whether the problem was inconsistency of the data in that column across rows, but apparently that isn't it, and as far as I know we haven't changed the schema.

And this is the query profile:
08c95232-a76d-426e-93be-21bfe562c8fc.zip (104.5 KB)

Does anyone know what is happening?

Thanks in advance!

My version:
Build: 22.1.1-202208230402290397-a7010f28
Edition: AWS Edition
Build Time: 23/08/2022 04:11:32
Change Hash: a7010f284dfb7bf28a29fb72846572e6991972d7
Change Time: 23/08/2022 03:53:28

@Natalia Are you able to share the 1 row JSON file?

Sure!
This is how Dremio reads the dataset from a Postgres table:


It reads the 'request' column as a string column.
1c43eb6f-b464-6748-f553-e487aeac4200.json.zip (2.2 KB)

@balaji.ramaswamy Is it working for you?

@balaji.ramaswamy the query makes 10 attempts and then gives that error message. We worked around it by creating a materialized table on S3 from the Postgres table and then converting the column to JSON there. That works, but I don't know if it's the best approach.
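For reference, the workaround described above could look roughly like this (the source and S3 path names `postgres_source.PaymentAudits` and `s3_source.payment_audits_copy` are hypothetical placeholders, not the actual names used):

```sql
-- Materialize the Postgres table into an S3-backed table first,
-- then run the JSON conversion against the materialized copy.
CREATE TABLE s3_source.payment_audits_copy AS
SELECT * FROM postgres_source.PaymentAudits;

SELECT CONVERT_FROM(request, 'JSON') AS request_json
FROM s3_source.payment_audits_copy;
```

Materializing first sidesteps the schema-learning loop against the relational source, at the cost of maintaining a copy of the data.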

Works for me using the file you shared, @Natalia.

Can you disable the reflection on the below VDS and try?

Raw.clinappproduction.v_raw_payments_audits