Hi, I wanted to ask how to shorten the time between creating a new table in the source and being able to query it in Dremio.
For example, the Snowflake (SF) connector (ARP) does not allow defining a discovery refresh rate, and it takes quite a long time before a new table is available for querying.
Additionally, can the discovery be triggered manually?
Lastly, the best option would be for Dremio to discover the table whenever someone tries to query a table it does not know about yet.
@Rakesh_Malugu
Hi! That would be awesome. I tried it, but I get a different error, and I don't understand why… The SF dataset definitely exists. I can run a query like select * from SF."METRICS3_STG".PUBLIC.TEST; just fine, but this fails: alter PDS SF refresh metadata;
@Rakesh_Malugu I can run that on an already discovered table. But if I run it on a table that Dremio does not know about yet, it fails with the exception I posted.
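For reference, the fully qualified form of the refresh statement, with the table path taken from the SELECT above (whether the path is the cause of the error here is an assumption on my part):

```sql
-- Refresh metadata for one specific table rather than the whole source.
-- The source/schema/table path mirrors the working SELECT above.
ALTER PDS SF."METRICS3_STG".PUBLIC.TEST REFRESH METADATA;
```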
I’m not sure what your underlying data source is; I have only created datasets on Parquet files.
An example request body to that API would look like this:
{"path": ["employee"], "entityType": "dataset", "type": "PHYSICAL_DATASET", "id": "employee", "format": {"type": "Parquet"}}
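A minimal Python sketch of sending that body to the Dremio catalog REST API. The coordinator URL, auth token, and the `/api/v3/catalog` endpoint path are assumptions here; adjust them to your deployment (and note that real catalog entity ids may differ from the simple path-based id mirrored from the example body):

```python
import json
import urllib.request

# Hypothetical values -- replace with your coordinator URL and a real token.
DREMIO_URL = "http://localhost:9047"
AUTH_TOKEN = "_dremio<your-token>"

def build_dataset_body(path, fmt="Parquet"):
    """Build a request body like the example above for a physical dataset.

    `path` is the dataset path as a list of components, e.g. ["employee"].
    The "id" field simply mirrors the example body; real ids may differ.
    """
    return {
        "path": path,
        "entityType": "dataset",
        "type": "PHYSICAL_DATASET",
        "id": "/".join(path),
        "format": {"type": fmt},
    }

def register_dataset(body):
    """POST the body to the catalog API (sketch; assumes a running Dremio)."""
    req = urllib.request.Request(
        f"{DREMIO_URL}/api/v3/catalog",
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": AUTH_TOKEN, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build and show the body without contacting a server.
print(json.dumps(build_dataset_body(["employee"])))
```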
You will get the same error when you try to run the query on a non-existing table.
If you already have a PDS in the newly created table's location, then I would recommend copying that PDS path, changing the table name to the newly created one, and running the ALTER PDS command.
@Rakesh_Malugu
I’m not sure you understand my issue. When I create a new table in Snowflake, Dremio does not know about it. When I run the ALTER PDS statement, Dremio says that the table does not exist, even though the table already exists in Snowflake. Dremio is not able to discover it until the periodic "Discover names" refresh runs. How can I tell Dremio that table X exists in Snowflake and have it registered as a PDS?