I want to connect Azure Databricks with Dremio and pull data. What is the best way to connect these two systems so that I can write analytical queries and pull data from Dremio?
Secondly, a basic question: does Dremio store any data itself, or does it only index the metadata, with the actual data residing in sources like Azure Storage or a MySQL database?
#1 Currently we do not support Databricks as a source
#2 If you create reflections on Azure datasets, we make a copy of the data based on the reflection definition. You can choose where these reflections are stored; if you are using Azure Storage, that would be a good place for them
#3 We store metadata on the Dremio coordinator
I was interested in pulling Dremio data from Azure Databricks. I achieved that by using the Dremio JDBC driver on the Spark cluster in Azure Databricks, with Scala code.
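For anyone looking for the details, a minimal sketch of what that Scala code can look like. The host, credentials, and dataset path below are placeholders, and it assumes the Dremio JDBC driver jar has been attached to the Databricks cluster:

```scala
// Read a Dremio dataset from a Databricks notebook via Spark's JDBC source.
// Assumptions: the Dremio JDBC driver jar is attached to the cluster, the
// coordinator is reachable at dremio-host:31010, and "space.myDataset" is a
// hypothetical dataset path in Dremio.
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder().getOrCreate()

val dremioDf: DataFrame = spark.read
  .format("jdbc")
  .option("driver", "com.dremio.jdbc.Driver")
  .option("url", "jdbc:dremio:direct=dremio-host:31010")
  .option("user", "dremio_user")          // placeholder credentials
  .option("password", "dremio_password")
  .option("query", "SELECT * FROM space.myDataset") // query is executed by Dremio
  .load()

dremioDf.show()
```

The `query` option pushes the SQL down to Dremio, so Dremio does the analytical work and Spark just receives the result set as a DataFrame.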
So Dremio only stores a copy of the data if we create a reflection on a data source; otherwise it just stores metadata. Is my understanding correct?
Hey @hghub, I am also trying to pull Dremio data from a Databricks notebook. In my case it's ODBC, and it works fine on my local system as I have installed dremio-connect there. How can I install an exe in the Databricks workspace? Please help me set this up.