Then just run “docker-compose up” and it should bring up both containers.
Then create an account in Dremio (you can access it at “localhost:9047” from your host) and try to add an External Source.
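For anyone following along, here is a minimal sketch of the kind of docker-compose.yml those steps assume. The dremio/dremio-oss image and its ports are the standard ones; the second service is just a placeholder data source (I picked Postgres here purely for illustration, swap in whatever image you actually want to run):

```yaml
# docker-compose.yml - minimal sketch, not a production config
version: "3"
services:
  dremio:
    image: dremio/dremio-oss      # official Dremio OSS image
    ports:
      - "9047:9047"               # web UI (the localhost:9047 mentioned above)
      - "31010:31010"             # ODBC/JDBC client port
  sample-source:
    image: postgres:15            # placeholder data source to register in the UI
    environment:
      POSTGRES_PASSWORD: example  # demo-only credential
```

With that file in place, “docker-compose up” starts both containers, and the placeholder service can then be added in the Dremio UI as an external source using host “sample-source” and port 5432.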
Why do you want to use docker-compose?
It’s a good tool to run multiple containers locally. I could run a Helm chart in KIND, but I prefer to keep it simple at the start.
I would also like to run a real cluster: a Coordinator node and some Executor nodes, all in separate containers launched by docker-compose.
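A rough sketch of what I have in mind is below. It assumes the dremio/dremio-oss image honours DREMIO_JAVA_SERVER_EXTRA_OPTS as -D overrides for dremio.conf (the same mechanism the Dremio Helm charts use), so treat the exact flags as a starting point rather than a verified config:

```yaml
# docker-compose.yml - rough sketch of a coordinator + executors topology.
# Assumption: DREMIO_JAVA_SERVER_EXTRA_OPTS is picked up by the image and
# overrides dremio.conf settings; double-check against your Dremio version.
version: "3"
services:
  coordinator:
    image: dremio/dremio-oss
    ports:
      - "9047:9047"               # web UI, only needed on the coordinator
      - "31010:31010"             # ODBC/JDBC client port
    environment:
      DREMIO_JAVA_SERVER_EXTRA_OPTS: >-
        -Dservices.executor.enabled=false
  executor:
    image: dremio/dremio-oss
    depends_on:
      - coordinator
    environment:
      DREMIO_JAVA_SERVER_EXTRA_OPTS: >-
        -Dservices.coordinator.enabled=false
        -Dservices.coordinator.master.enabled=false
        -Dservices.coordinator.master.embedded-zookeeper.enabled=false
        -Dzookeeper=coordinator:2181
```

If that works, adding executors would just be “docker-compose up --scale executor=3”.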
@YEN provided a great tutorial on the quick “docker run” approach, which covers the basic functionality.
Can you explain what you hope to achieve with Docker Compose? Do you wish to simulate one or more of Dremio’s production architectures? Do you want to tweak or customize the Dremio front end from dremio-oss?
The main purpose of running a Docker version of Dremio, for me, is a local development/test environment. My main Dremio runs on AWS. I want to test or validate a few queries and try changing some settings that apply both to the AWS deployment and the Docker one. I want to see whether my laptop is faster at refreshing reflections all locally, and whether it may be more cost effective to run Dremio from a custom Docker image or a docker-compose setup that also includes an Oracle or SQL Server config.
For example, I deployed Oracle Enterprise 19c on Docker and then used it as a data source from Dremio-OSS.
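If you want that combination in a single compose file, a sketch might look like the one below. The Oracle image path and environment variables are assumptions based on Oracle’s container registry (you have to log in and accept the license before pulling), so verify them against whatever registry you actually use:

```yaml
# Sketch: Dremio OSS plus an Oracle 19c service as a data source.
# Image path, SID and env vars for the Oracle service are assumptions -
# check Oracle's container registry documentation before relying on them.
version: "3"
services:
  dremio:
    image: dremio/dremio-oss
    ports:
      - "9047:9047"
  oracle:
    image: container-registry.oracle.com/database/enterprise:19.3.0.0
    ports:
      - "1521:1521"               # listener port to use in the Dremio source config
    environment:
      ORACLE_SID: ORCLCDB         # hypothetical SID for the demo
      ORACLE_PWD: ChangeMe123     # demo-only password
```

In the Dremio UI the Oracle source would then be added with host “oracle”, port 1521, and those credentials.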
If your interest is more in the multi-node approach, or you have a hosted Docker enterprise environment that can leverage scalability or run on Kubernetes, then running a dremio-oss image and a few data sources won’t be enough.
If you don’t plan to use the AWS or Azure versions of Dremio, you need to consider the complexity of running Hadoop or MapR as a prerequisite to enabling your docker-compose components (a.k.a. engines, clusters, or swarms). Most of those multi-node deployments are so complex that other companies provide custom images and managed services around those more open solutions.
If you want to simulate a self-hosted Hadoop or MapR environment for Dremio, then I could see some value in a docker-compose setup; just remember that the bulk of the work will not be Dremio-specific. Instead, you’d need a suite of Hadoop Docker image configurations that Dremio fits well with, perhaps a few cloud images to integrate with a cloud storage layer, or even a managed hybrid-cloud YARN cluster that could share both cloud and local nodes to run Dremio queries.
I think Dremio on YARN/Hadoop could be fully open source in a docker-compose setup; it would really benefit the community to have a pre-built “test suite” of images for simulating or testing a multi-node Hadoop engine with Dremio as the query engine and front end.