Validation error for custom ARP connector

I have created a custom ARP connector for Firebird. I have deployed it to my Dremio instance. When I hit save, the UI gives me the following error:

But it is not clear to me what value it deems invalid. Where can I find more information?

You can check out my code on GitHub: serraict/dremio-firebird-connector (a Dremio ARP-driven connector that supports Firebird).

Please set the log level to TRACE or DEBUG for ARP and share the log.

Thank you, will do.

I am still learning how this system works, so please bear with me.
At the moment I only get the server.out log in /var/log/dremio, so I think I am missing some high-level setting to enable logging (queries.json is not written either).
I will post back here when I have more information.

@dacopan I can’t get any errors to turn up in the log. Could it be that this is a validation leftover from the SQLite ARP connector template?

Please enable more trace/debug logging. To debug query pushdowns, add the following logger to logback.xml:

  <logger name="">
    <level value="${dremio.log.level:-trace}"/>
  </logger>

Then check the error when you save, and share your logs.
I’ll download and build your connector to test, but give me some time.
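
To make that snippet concrete: a complete logger block could look like the sketch below. The com.dremio.exec.store.jdbc package name is an assumption, inferred from the abbreviated logger names (c.d.e.s.jdbc.conf) that show up later in this thread; adjust it to the component you actually want to trace.

```xml
<!-- In logback.xml: enable trace logging for the JDBC/ARP layer.   -->
<!-- The package name below is an assumption inferred from Dremio's -->
<!-- log output; point it at whatever component you want to trace.  -->
<logger name="com.dremio.exec.store.jdbc">
  <level value="${dremio.log.level:-trace}"/>
</logger>
```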

Also share your setup: Java version, Dremio version, OS version, etc.


Thank you for showing interest and helping out. I have added the logger and found the log information I was looking for:

2024-03-29 15:25:44,384 [main] DEBUG - createDialect called
2024-03-29 15:25:44,421 [main] ERROR c.d.e.s.jdbc.conf.AbstractArpConf - Error creating dialect from ARP file arp/implementation/firebird-arp.yaml.
com.fasterxml.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of ``, problem: Invalid Dremio typename specified in ARP file: 'char'.
 at [Source: (StringReader); line: 55, column: 5] (through reference chain:["data_types"]->["mappings"]->java.util.ArrayList[3])

This static code runs earlier than I expected, and the message got lost among other log output. I was also unfamiliar with the logging concepts used, so I had to learn how to use logback and docker logs.

My current setup basically uses the Docker Compose setup from this post. I spent some time looking for the log files, but for some reason they are not available inside the Docker containers. So I resorted to docker logs -f dremio | grep ".jdbc.".

I will now proceed by removing all types from the mapping file, and then systematically adding the ones I need.

Happy to help. I suggest you use Gradle and base your work on the Snowflake connector or my DB2 connector; their type mappings are more complete than the one you are using.

OK, removing all but the integer mapping results in a successfully created ArpDialect.
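
For anyone following along, a minimal data_types section looks roughly like this. This is a sketch based on the public ARP connector template; the key point from the error above is that the dremio name must be one of Dremio's recognized type names (so 'integer' is valid, 'char' is not).

```yaml
# Sketch of a minimal data_types section in firebird-arp.yaml,
# following the structure of the public ARP connector template.
# The source name is the database's type; the dremio name must be
# a valid Dremio type name, or AbstractArpConf rejects the file.
data_types:
  mappings:
    - source:
        name: "INTEGER"
      dremio:
        name: "integer"
```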

But when I enter abc for the name and hit Save three times, I get the same empty error in the UI, and nothing in the logs except for:

- - [29/Mar/2024:16:09:46 +0000] "GET /static/icons/dremio/sources/FIREBIRD.svg HTTP/1.1" 404 360 "http://dremio.localhost/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/ Safari/537.36"
- - [29/Mar/2024:16:09:51 +0000] "PUT /apiv2/source/abc/?nocache=1711728591608 HTTP/1.1" 400 62 "http://dremio.localhost/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/ Safari/537.36"
- - [29/Mar/2024:16:10:00 +0000] "PUT /apiv2/source/abc/?nocache=1711728600228 HTTP/1.1" 400 62 "http://dremio.localhost/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/ Safari/537.36"
- - [29/Mar/2024:16:10:01 +0000] "PUT /apiv2/source/abc/?nocache=1711728601365 HTTP/1.1" 400 62 "http://dremio.localhost/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/ Safari/537.36"

I think the problem is a version mismatch. Please share your setup:
Java version, Dremio version, OS version, etc.

Dev laptop: MacBook Pro with Apple M1 Pro, macOS Sonoma 14.4

➜  ~ java --version
openjdk 21.0.2 2024-01-16 LTS
OpenJDK Runtime Environment Temurin-21.0.2+13 (build 21.0.2+13-LTS)
OpenJDK 64-Bit Server VM Temurin-21.0.2+13 (build 21.0.2+13-LTS, mixed mode)
➜  ~ mvn --version
Apache Maven 3.9.6 (bc0240f3c744dd6b6ec2920b3cd08dcc295161ae)
Maven home: /opt/homebrew/Cellar/maven/3.9.6/libexec
Java version: 21.0.1, vendor: Homebrew, runtime: /opt/homebrew/Cellar/openjdk/21.0.1/libexec/openjdk.jdk/Contents/Home
Default locale: en_NL, platform encoding: UTF-8
OS name: "mac os x", version: "14.4", arch: "aarch64", family: "mac"

I compile the jar on my Mac and then deploy it to the latest Dremio Docker image using docker cp; see the Makefile for details.

Build Information
Edition: Community Edition
Build Time: 24/01/2024 18:31:10
Change Hash:
Change Time: 24/01/2024 18:11:23

Let me try tonight. To save time, please attach a jar build of your connector to a release in your repository.

I’m working on optimizing the test workflow right now, so that a test database can be set up easily and the GitHub build runs faster (downloading all the dependencies takes ~10 minutes each run).

The jar can be downloaded from the release page:

I am struggling with the dependency cache; for some reason the GitHub Action refuses to retrieve it, so the build still takes too long to run.

Hello, sorry for the delay. I was able to reproduce the error; it was an issue in your connector’s Conf class. Let me fix it. I also recommend changing to Gradle; let me do that as well. I’ll tell you when it’s fixed.


Thank you, appreciated. I am open to any suggestions and eager to learn.

How did you catch the error? Did it show in the log somewhere?

Debugging a local instance of Dremio, I see that in the ParamConverters class Jersey cannot deserialize the request. This is because you set @SourceType(value = "firebird") in lowercase, and Dremio needs it in uppercase. Also, the correct driver constant is:

 private static final String DRIVER = "org.firebirdsql.jdbc.FBDriver";
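
Putting both fixes together, the relevant part of the conf class might look like the sketch below. The class name FirebirdConf and the omitted members are assumptions based on the standard ARP connector template; the ARP file path matches the one reported in the error earlier in this thread.

```java
// Sketch of the connector's conf class after the fixes. Class name and
// omitted members are assumptions based on the ARP connector template.
// The @SourceType value must be uppercase for Dremio to accept it.
@SourceType(value = "FIREBIRD", label = "Firebird")
public class FirebirdConf extends AbstractArpConf<FirebirdConf> {
  private static final String ARP_FILENAME = "arp/implementation/firebird-arp.yaml";
  // Fully qualified class name of the Jaybird (Firebird) JDBC driver.
  private static final String DRIVER = "org.firebirdsql.jdbc.FBDriver";
  // ... connection properties and plugin config methods omitted ...
}
```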

I also changed the build to Gradle; for details, check my PR on GitHub.


Thank you for your help @dacopan , this issue is now resolved and I can connect to my Firebird test database.

You are welcome. I recommend you review the Snowflake connector and my DB2 connector; both have many years in production.
While you test, keep the log level at TRACE or DEBUG so the logs show all failed pushdowns to the database. A query may run successfully, but when you review the logs you may find that some operators, functions, etc. are not pushed down, causing bad query performance and stressing your original data source.
Also review the physical plan in the job profile, where you can see the final native SQL pushed down to the database.
