I’m trying to connect to my PostgreSQL database through Dremio’s JDBC driver and SparkSQL. As far as I can tell, everything on the Spark side is configured correctly, but I keep getting the following error:
Intermediate Configuration and Plugin files, in order of precedence:
- jar:file:/home/shayan/spark-2.1.1-bin-hadoop2.7/dremio-jdbc-driver-1.0.8-201707190805180330-27f36e1.jar!/sabot-module.conf
17/07/20 18:18:23 ERROR DremioJdbc41Factory: Failed to create prepared statement: VALIDATION ERROR: Table 'schema1.table1' not found
A few things:
The table is definitely there: I can see it in the Dremio UI and it is accessible from there.
It seems that SparkSQL isn’t actually establishing a connection through Dremio, because no matter what database name I put at the end of the JDBC URL (jdbc:dremio:direct=127.0.1.1:31010:testData), I always get the same error.
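For completeness, here is roughly what the Spark side looks like. This is only a sketch of my setup: the host, table, and credentials are placeholders, and I am assuming the driver class com.dremio.jdbc.Driver registered by the Dremio JDBC jar.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dremio-jdbc-test").getOrCreate()

# Read through Spark's standard JDBC source, pointing at the Dremio coordinator.
# This is the configuration that produces the VALIDATION ERROR above.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:dremio:direct=127.0.1.1:31010:testData")
      .option("driver", "com.dremio.jdbc.Driver")   # assumed driver class from the Dremio JDBC jar
      .option("dbtable", "schema1.table1")
      .option("user", "dremio_user")                # placeholder credentials
      .option("password", "dremio_password")
      .load())

df.show()
```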
It sounds like you are querying Dremio through SparkSQL, and Dremio in turn is configured with a PostgreSQL source. Is that correct?
It would probably be best to first check that Dremio is configured properly to talk to Postgres. In the Dremio UI, are you able to browse the Postgres tables and run queries against them?
Is it correct to assume that "schema1" is the name of the schema in the Postgres database?
If so, when querying through Dremio you need to include the name of the Postgres source in the table path. Is "testData" the name of the Postgres source in Dremio? If you want to include the default schema in the JDBC connection string, I believe it needs to be separated with a semicolon: "jdbc:dremio:direct=127.0.1.1:31010;schema=testData". Alternatively, you can leave the schema part out of the URL and query the table as "testData.schema1.table1".
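If it helps, here is a rough sketch of how both variants would look from Spark, assuming the same standard JDBC options as in your snippet above (the user/password values are placeholders):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dremio-jdbc-test").getOrCreate()

# Variant 1: put the default schema in the connection string (semicolon-separated)
# and keep the shorter table path.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:dremio:direct=127.0.1.1:31010;schema=testData")
      .option("driver", "com.dremio.jdbc.Driver")
      .option("dbtable", "schema1.table1")
      .option("user", "dremio_user")         # placeholder credentials
      .option("password", "dremio_password")
      .load())

# Variant 2: leave the schema out of the URL and fully qualify the table
# with the name of the Postgres source as it appears in Dremio.
df2 = (spark.read.format("jdbc")
       .option("url", "jdbc:dremio:direct=127.0.1.1:31010")
       .option("driver", "com.dremio.jdbc.Driver")
       .option("dbtable", "testData.schema1.table1")
       .option("user", "dremio_user")
       .option("password", "dremio_password")
       .load())

df.show()
```

Either way, the key point is that the path Dremio resolves has to start with the Dremio source name, not the Postgres database name.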