The environment setup is as follows:
- Dremio v2.4.1
- Hive standalone metastore as the Iceberg catalog (Hive version 2.x)
- Catalog and data stored on on-prem S3 storage
- No Hadoop or HDFS installed
The table is created successfully and the Hive metastore is discovered successfully, but selecting from the table fails with the following error:
"com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: java.io.InterruptedIOException: doesBucketExist on dwh-test: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Failed to connect to service endpoint: "
I’ve added the following properties in the advanced options:
fs.s3a.access.key
fs.s3a.secret.key
fs.s3a.endpoint
fs.s3a.impl = org.apache.hadoop.fs.s3a.S3AFileSystem
fs.s3a.path.style.access = true
hadoop.fs.s3a.aws.credentials.provider = org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider
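For comparison, a minimal S3A configuration sketch (the endpoint and credential values below are placeholders, not the real ones from this setup). Note that the standard Hadoop property name is fs.s3a.aws.credentials.provider, without the hadoop. prefix:

fs.s3a.access.key = <access-key>
fs.s3a.secret.key = <secret-key>
fs.s3a.endpoint = https://s3.internal.example.com
fs.s3a.path.style.access = true
fs.s3a.impl = org.apache.hadoop.fs.s3a.S3AFileSystem
fs.s3a.aws.credentials.provider = org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider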
With these settings, the query fails with a different error: “com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on dwh-test: com.amazonaws.SdkClientException: Unable to execute HTTP request: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target: Unable to execute HTTP request: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target”
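The PKIX error means the JVM running Dremio does not trust the certificate presented by the on-prem S3 endpoint. One common fix is to import that certificate (or its issuing CA) into the JVM’s truststore and restart Dremio; a sketch, assuming a placeholder hostname s3.internal.example.com and the default cacerts password (on Java 8 the truststore lives under $JAVA_HOME/jre/lib/security/cacerts instead):

# Grab the certificate the endpoint presents (hostname is a placeholder)
openssl s_client -connect s3.internal.example.com:443 -showcerts </dev/null | openssl x509 -outform PEM > s3-ca.pem

# Import it into the JVM truststore, then restart Dremio
keytool -importcert -trustcacerts -alias onprem-s3 -keystore "$JAVA_HOME/lib/security/cacerts" -storepass changeit -file s3-ca.pem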
After adding
fs.s3a.connection.ssl.enabled = false
it fails with: “SocketException: Connection reset”
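A connection reset right after disabling SSL often means the endpoint only accepts HTTPS and drops plain-HTTP requests, so the truststore route above is likely the proper fix rather than turning SSL off. A quick way to check which protocols the endpoint actually answers (URL is a placeholder):

# Does the endpoint answer over plain HTTP?
curl -v http://s3.internal.example.com/

# Over HTTPS, skipping certificate validation just for this test
curl -vk https://s3.internal.example.com/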