In many of my workflows, I filter data from a variety of local or remote sources using R, and then want to upload the result set to my Dremio instance to join it with a table on HDFS.
For instance, I may grab partner information from Salesforce databases and filter it down to 10K partner IDs whose records I'd like to pull from the HDFS table.
I would normally hope to accomplish this by pumping the partner IDs into a temp table in Dremio and then having Dremio perform the join in a subsequent SQL statement. I know I could write these IDs into a CSV and upload it manually, but that hampers repeatability and I'd really like to avoid the manual steps.
The other solution would be (depending on size) to inline the partner IDs into the actual SQL statement being run (WHERE partner_id IN ('x', …)), which would create the longest statement ever.
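For illustration, building that giant IN clause in R would look something like this (the table and column names are just placeholders for my real ones):

```r
# Hypothetical partner IDs filtered down in R
partner_ids <- c("P001", "P002", "P003")

# Quote each ID and splice the list into the statement
id_list <- paste0("'", partner_ids, "'", collapse = ", ")
query <- paste0(
  "SELECT * FROM hdfs.partners WHERE partner_id IN (", id_list, ")"
)
```

With 10K IDs that string gets unwieldy fast, which is why I'd prefer a real temp table.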
It also looks like there is a way to create tables via SQL (https://docs.dremio.com/sql-reference/sql-commands/tables.html), so the concept exists, but RODBC's sqlSave seems to fail with:
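For reference, roughly what I'm hoping to achieve is a CREATE TABLE AS built from the IDs and sent over the ODBC channel; the space/table names below are made up, and I'm not sure Dremio accepts this exact VALUES syntax:

```r
partner_ids <- c("P001", "P002")

# Build a VALUES list from the IDs
values <- paste0("('", partner_ids, "')", collapse = ", ")
ctas <- paste0(
  "CREATE TABLE myspace.tmp_partner_ids AS ",
  "SELECT * FROM (VALUES ", values, ") AS t(partner_id)"
)
# Then, over an open RODBC channel `ch`:
# sqlQuery(ch, ctas)
```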
Query execution error. Details: PARSE ERROR: Failure parsing the query.
How can I do this?