Hive metastore
is already supported by many big data products, and it can now be split out of Hadoop and run as a standalone service in a cloud-native setup.
Dremio can read the Hive metastore even if it is outside of Hadoop. Is that your question? As long as Dremio can reach the metastore host and get the locations of the files on HDFS, it should work.
Thanks
Bali
Currently Dremio holds its metadata in a directory on disk.
Can I configure Dremio to use the Hive metastore as a custom metadata store instead?
Dremio’s metadata store (RocksDB) and the Hive metastore are two entirely different things; they cannot be combined.
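For context, Dremio’s own metadata lives in a local KV store configured in `dremio.conf`. A minimal sketch of the relevant settings, assuming a default-style install (the directory paths are placeholders, not your actual layout):

```hocon
paths: {
  # Local disk directory holding Dremio's RocksDB-backed metadata store.
  # On the master coordinator this must be a persistent mounted disk.
  local: "/var/lib/dremio"

  # Distributed store (accelerations, uploads, job results); can point
  # at HDFS, S3, etc. in a multi-node deployment.
  dist: "pdfs://"${paths.local}"/pdfs"
}
```

The point being: `paths.local` is Dremio-internal state, and there is no supported switch to redirect it into the Hive metastore.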
Is there a way to register a dataset served by Dremio (over Apache Arrow Flight) in the Hive Metastore?
In such an architecture Dremio is not the data lake engine but part of a bigger data mesh.
Is that what you meant, koolay?
Is that making sense?
Yes, it’s another very useful use case.
Dremio’s metastore requires a mounted disk in distributed deployments.
Dremio would be stateless if its metastore were backed by the Hive metastore.