KTable connector?

Hello,
Is it planned to support Kafka KTable as a source for Dremio? It would be great for building an IoT platform with Kafka and Dremio, where Kafka is used as a streaming datastore and Dremio as a query engine and data governance tool. No more need for Hadoop then.

Kafka is currently not supported as a source but is on our roadmap.
The best option right now is to output the stream to a file system like HDFS/S3/etc., then access that as a source.
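For example, a Spark Structured Streaming job could land the Kafka topic as Parquet on S3 and Dremio could then query that path as a file-system source. A rough sketch, assuming a hypothetical `sensor-events` topic, broker address and S3 bucket (all placeholders), with the `spark-sql-kafka` package and S3A credentials already configured:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kafka-to-s3-landing")
    .getOrCreate()
)

# Read the Kafka topic as a stream (requires the spark-sql-kafka package on the classpath).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
    .option("subscribe", "sensor-events")                # placeholder topic
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
)

# Land the stream as Parquet files on S3; Dremio can then be pointed
# at this path as a source.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://my-iot-bucket/landing/sensor-events/")
    .option("checkpointLocation", "s3a://my-iot-bucket/checkpoints/sensor-events/")
    .start()
)
query.awaitTermination()
```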

Thanks for the answer; the suggested solution is actually what we would like to avoid, because it requires a second cluster to store the data when we already have a datastore with Kafka.

Otherwise, is a Druid connector planned?

Druid as a source is something we are watching to gauge community interest. It is not currently scheduled.

Hopefully a little later this year we will have an SDK so people can start to build their own connectors and contribute those back to the project when that makes sense.

Nice…

On another subject: currently the way to access Dremio from Spark is JDBC… couldn't there be a dedicated connector?

That’s correct, JDBC is probably the best option right now.
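For reference, a minimal sketch of reading a Dremio dataset from Spark over JDBC, assuming the Dremio JDBC driver jar is on the Spark classpath; the host, port, credentials and dataset path below are placeholders:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dremio-over-jdbc")
    .getOrCreate()
)

# Read a Dremio dataset into a Spark DataFrame over JDBC.
df = (
    spark.read
    .format("jdbc")
    .option("url", "jdbc:dremio:direct=dremio-coordinator:31010")  # placeholder host:port
    .option("driver", "com.dremio.jdbc.Driver")
    .option("dbtable", '"my_space"."iot_dataset"')                 # placeholder dataset
    .option("user", "dremio_user")
    .option("password", "dremio_password")
    .load()
)

df.show(5)
```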

We have so many things we want to build, and we are hard at work to get them released as quickly as possible. Unfortunately, there are only so many hours in the day. :slight_smile:

Don’t worry :slight_smile:
It was just a question to know whether that is intrinsically the best way.

Regards