Airbyte and MinIO

Hi,
I am trying to ingest data into an Iceberg catalog hosted on my local MinIO. The Airbyte docs explain how to set this up for S3; I tried the same for MinIO, but it did not work. Can you help?
Thanks,
Tolga

@tolgaevren What is the issue? Does it fail when you try to create an Iceberg table on S3?

Hi Balaji, no, I am not at that point yet. I am creating a destination in Airbyte; the destination type is Iceberg. (There is a video that explains how to ingest data using Airbyte and Dremio, but in that video the destination is AWS S3.) I am using local MinIO.
This is the error message:
2023-12-13 10:31:00 platform > Docker volume job log path: /tmp/workspace/f1d60308-1347-49e7-84db-b5ef937c92d8/0/logs.log
2023-12-13 10:31:00 platform > Executing worker wrapper. Airbyte version: 0.50.36
2023-12-13 10:31:00 platform > Attempt 0 to save workflow id for cancellation
2023-12-13 10:31:00 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2023-12-13 10:31:00 platform >
2023-12-13 10:31:00 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2023-12-13 10:31:00 platform > ----- START CHECK -----
2023-12-13 10:31:00 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2023-12-13 10:31:00 platform >
2023-12-13 10:31:00 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2023-12-13 10:31:00 platform > Using default value for environment variable LAUNCHDARKLY_KEY: ''
2023-12-13 10:31:00 platform > Checking if airbyte/destination-iceberg:0.1.4 exists...
2023-12-13 10:31:00 platform > airbyte/destination-iceberg:0.1.4 was found locally.
2023-12-13 10:31:00 platform > Creating docker container = destination-iceberg-check-f1d60308-1347-49e7-84db-b5ef937c92d8-0-bvxkr with resources io.airbyte.config.ResourceRequirements@3d429800[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-12-13 10:31:00 platform > Preparing command: docker run --rm --init -i -w /data/f1d60308-1347-49e7-84db-b5ef937c92d8/0 --log-driver none --name destination-iceberg-check-f1d60308-1347-49e7-84db-b5ef937c92d8-0-bvxkr --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-iceberg:0.1.4 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.36 -e WORKER_JOB_ID=f1d60308-1347-49e7-84db-b5ef937c92d8 airbyte/destination-iceberg:0.1.4 check --config source_config.json
2023-12-13 10:31:00 platform > Reading messages from protocol version 0.2.0
2023-12-13 10:31:13 platform > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2023-12-13 10:31:13 platform > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.iceberg.IcebergDestination
2023-12-13 10:31:13 platform > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: CHECK
2023-12-13 10:31:13 platform > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2023-12-13 10:31:14 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-12-13 10:31:14 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword

These are my definitions in Airbyte (I also uploaded an image):

S3 Warehouse Uri for Iceberg:
s3a://lakehouse

endpoint:
http://172.20.10.9:9000

I can access the MinIO UI using this IP.
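
For what it is worth, the endpoint and bucket can be sanity-checked outside Airbyte with a few lines of boto3 (a sketch; the keys below are placeholders for my MinIO credentials):

    import boto3
    from botocore.config import Config

    # MinIO speaks the S3 API, so only the endpoint_url differs from AWS.
    # Path-style addressing is the usual requirement for MinIO (and the only
    # option for an IP-based endpoint anyway).
    s3 = boto3.client(
        "s3",
        endpoint_url="http://172.20.10.9:9000",
        aws_access_key_id="<minio-access-key>",
        aws_secret_access_key="<minio-secret-key>",
        config=Config(s3={"addressing_style": "path"}),
    )

    # The warehouse URI s3a://lakehouse implies a bucket named "lakehouse";
    # head_bucket raises if it is missing or the credentials are rejected.
    s3.head_bucket(Bucket="lakehouse")
    print(s3.list_objects_v2(Bucket="lakehouse", MaxKeys=5).get("Contents", []))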

@tolgaevren

I suggest reaching out to the Airbyte community in the airbytehq/airbyte GitHub Discussions or the MinIO community in the minio/minio GitHub Discussions.
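
One thing worth checking before you post there: the s3a:// warehouse URI suggests the connector writes through Hadoop's S3A filesystem, and against MinIO that layer typically needs path-style access enabled and, for a plain-http endpoint, SSL disabled. As a rough illustration (not the connector's actual code), this is how the same warehouse could be reached from a standalone PySpark session, assuming the iceberg-spark-runtime and hadoop-aws jars are on the classpath; the keys are placeholders:

    from pyspark.sql import SparkSession

    # Hypothetical standalone check: point an Iceberg "hadoop" catalog at the
    # same MinIO warehouse. The fs.s3a.* keys are standard Hadoop S3A options.
    spark = (
        SparkSession.builder
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.type", "hadoop")
        .config("spark.sql.catalog.lake.warehouse", "s3a://lakehouse")
        .config("spark.hadoop.fs.s3a.endpoint", "http://172.20.10.9:9000")
        .config("spark.hadoop.fs.s3a.access.key", "<minio-access-key>")
        .config("spark.hadoop.fs.s3a.secret.key", "<minio-secret-key>")
        .config("spark.hadoop.fs.s3a.path.style.access", "true")  # MinIO wants path-style
        .config("spark.hadoop.fs.s3a.connection.ssl.enabled", "false")  # endpoint is plain http
        .getOrCreate()
    )

    # If this succeeds, the endpoint, credentials, and bucket are fine and the
    # problem is on the Airbyte side; if it fails the same way, suspect the S3A config.
    spark.sql("CREATE TABLE IF NOT EXISTS lake.db.smoke_test (id INT) USING iceberg")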

Once both Airbyte and MinIO are functioning well together, please feel free to post back for any Dremio-related help you need.