Fake S3 support?

Hello,

I tried to make Dremio connect to our “fake S3” by setting the “fs.s3a.endpoint” property, but it still points to Amazon.

Any idea if we can use the S3 connector with a fake S3 provider (Pithos on Cassandra, or Minio)?

Caused by: org.apache.http.conn.ConnectTimeoutException: Connect to s3.amazonaws.com:80 [s3.amazonaws.com/52.216.84.213] failed: connect timed out
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:150) ~[httpclient-4.5.2.jar:4.5.2]
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353) ~[httpclient-4.5.2.jar:4.5.2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_161]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_161]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_161]

Not currently, though this is something we are looking at.

Hadoop seems to have this covered, so you could reach the fake S3 by mounting it in Hadoop and then adding HDFS as a source in Dremio. But that’s a bit convoluted …
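For reference, pointing Hadoop’s s3a connector at a non-AWS endpoint is usually done with properties like the following in core-site.xml. This is only a sketch: the endpoint URL and credentials are placeholders, and path-style access is typically needed for Minio-style servers that don’t support virtual-host bucket addressing.

```xml
<!-- core-site.xml: point the s3a connector at an S3-compatible endpoint.
     The host, port, and credentials below are placeholders. -->
<configuration>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>http://minio.example.internal:9000</value>
  </property>
  <property>
    <!-- Use path-style requests (http://host/bucket/key) instead of
         virtual-host style (http://bucket.host/key). -->
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
</configuration>
```

With this in place, Hadoop tools can address the fake S3 via `s3a://bucket/path`; the limitation discussed here is that Dremio’s S3 source doesn’t (yet) expose a way to pass these properties through.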

When will the ability to provide connection properties like fs.s3a.endpoint be supported?

I would like to use the S3 connector to connect to Oracle Object Storage.
https://docs.cloud.oracle.com/iaas/Content/Object/Tasks/s3compatibleapi.htm

Support for S3 compatible products (like Minio) is on our roadmap, but timing is still TBD.