Hi,
I’ve installed a Dremio cluster on AWS.
Each node has an IAM role attached that allows the Get* and List* actions on a few S3 buckets.
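For context, the policy attached to the role looks roughly like this (the bucket name is a placeholder, not the real one):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:Get*", "s3:List*"],
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ]
    }
  ]
}
```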
I verified with the AWS CLI that I’m able to download objects and list those buckets.
However, when I try to add an S3 source with the Instance Metadata authentication option, it fails and I see the following error in the log:
Caused by: java.util.concurrent.ExecutionException: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 4603113E25A6E409), S3 Extended Request ID: y4Fta+eFpxn0IRPwjKd++Ldb780G1TfMklTgIp0icLtcxz2oqfmiDyS3eF57Dj6VG62fPkN//M0=
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500) ~[guava-20.0.jar:na]
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:401) ~[guava-20.0.jar:na]
at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:83) ~[guava-20.0.jar:na]
at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:68) ~[guava-20.0.jar:na]
at com.google.common.util.concurrent.AbstractCheckedFuture.checkedGet(AbstractCheckedFuture.java:104) ~[guava-20.0.jar:na]
at com.dremio.exec.catalog.CatalogServiceImpl.createOrUpdateSource(CatalogServiceImpl.java:627) [dremio-sabot-kernel-3.1.6-201903070042400578-fdcd3a8.jar:3.1.6-201903070042400578-fdcd3a8]
… 55 common frames omitted
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 4603113E25A6E409)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1588) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1258) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513) ~[aws-java-sdk-core-1.11.156.jar:na]
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4221) ~[aws-java-sdk-s3-1.11.156.jar:na]
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4168) ~[aws-java-sdk-s3-1.11.156.jar:na]
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4162) ~[aws-java-sdk-s3-1.11.156.jar:na]
at com.amazonaws.services.s3.AmazonS3Client.getS3AccountOwner(AmazonS3Client.java:905) ~[aws-java-sdk-s3-1.11.156.jar:na]
at com.amazonaws.services.s3.AmazonS3Client.getS3AccountOwner(AmazonS3Client.java:896) ~[aws-java-sdk-s3-1.11.156.jar:na]
at com.dremio.plugins.s3.store.S3FileSystem.setup(S3FileSystem.java:104) ~[dremio-s3-plugin-3.1.6-201903070042400578-fdcd3a8.jar:3.1.6-201903070042400578-fdcd3a8]
at com.dremio.plugins.util.ContainerFileSystem.initialize(ContainerFileSystem.java:155) ~[dremio-s3-plugin-3.1.6-201903070042400578-fdcd3a8.jar:3.1.6-201903070042400578-fdcd3a8]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2812) ~[hadoop-common-2.8.3.jar:na]
I saw that the docs say:
“To list your AWS account’s S3 buckets as a source, you must provide your AWS credentials in the form of your access and secret keys.”
Does this mean I can’t use IAM roles? In our company we try to avoid long-lived access/secret keys where possible.
Can you please suggest a way to solve this?
Thanks,
Rafi